id | author | task_category | tags | created_time | last_modified | downloads | likes | README | matched_task | matched_bigbio_names | is_bionlp | model_cards | metadata |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|
SEACrowd/mdeberta-v3_sea_translationese | SEACrowd | text-classification | [
"transformers",
"safetensors",
"deberta-v2",
"text-classification",
"translationese",
"classification",
"sea",
"southeast asia",
"en",
"id",
"ms",
"vi",
"th",
"lo",
"km",
"my",
"tl",
"arxiv:2406.10118",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
]
| 2024-05-26T15:01:21 | 2024-06-18T13:06:30 | 23 | 3 | ---
language:
- en
- id
- ms
- vi
- th
- lo
- km
- my
- tl
library_name: transformers
license: apache-2.0
metrics:
- accuracy
pipeline_tag: text-classification
tags:
- translationese
- classification
- sea
- southeast asia
---
<img width="100%" alt="SEACrowd Logo" src="https://github.com/SEACrowd/.github/blob/main/profile/assets/seacrowd-email-banner-without-logo.png?raw=true">
This is our fine-tuned mDeBERTa SEA translationese classifier for the ["SEACrowd: A Multilingual Multimodal Data Hub and Benchmark Suite for Southeast Asian Languages"](https://arxiv.org/pdf/2406.10118) paper.
SEACrowd is a [collaborative initiative](https://github.com/SEACrowd) that consolidates a [comprehensive resource hub](https://seacrowd.github.io/seacrowd-catalogue/), filling the resource gap by [providing standardized corpora](https://github.com/SEACrowd/seacrowd-datahub) in nearly 1,000 Southeast Asian (SEA) languages across three modalities.
# Model Card for SEACrowd/mdeberta-v3_sea_translationese
To analyze the generation quality of LLMs in SEA languages, we build a text classifier to discriminate between translationese and natural texts. We construct a translationese classification training and testing dataset using 49 and 62 data subsets, respectively, covering approximately 39.9k and 51.5k sentences across 9 SEA languages: English (eng), Indonesian (ind), Khmer (khm), Lao (lao), Burmese (mya), Filipino (fil), Thai (tha), Vietnamese (vie), and Malay (zlm).
> Our translationese vs. natural train/test data is available on [SEACrowd/sea_translationese_resampled](https://huggingface.co/datasets/SEACrowd/sea_translationese_resampled).
To fine-tune the translationese classifier, check out our [experiments repository on GitHub](https://github.com/SEACrowd/seacrowd-experiments). We use binary labels (translationese, i.e., machine-translated or human-translated, vs. natural, i.e., human-generated) instead of 3 labels (machine-translated, human-translated, human-generated).
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** SEACrowd
- **Funded by:** SEACrowd
- **Shared by:** SEACrowd
- **Model type:** Encoder-Only (DebertaV2ForSequenceClassification)
- **Language(s) (NLP):** eng, ind, khm, lao, mya, fil, tha, vie, zsm
- **License:** Apache 2.0
- **Finetuned from model:** microsoft/mdeberta-v3-base
### Model Sources
<!-- Provide the basic links for the model. -->
- **Paper:** https://arxiv.org/abs/2406.10118
- **Experiment:** https://github.com/SEACrowd/seacrowd-experiments
- **Data Hub:** https://github.com/SEACrowd/seacrowd-datahub
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
To discriminate between translationese and natural texts in 9 SEA languages: English (eng), Indonesian (ind), Khmer (khm), Lao (lao), Burmese (mya), Filipino (fil), Thai (tha), Vietnamese (vie), and Malay (zlm).
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
The model is developed for detecting whether a text is `human-translated`, `machine-translated`, or `natural`.
The model supports 9 languages: `eng`, `ind`, `khm`, `lao`, `mya`, `fil`, `tha`, `vie`, `zsm`.
The label mapping of the model is defined as follows:
```python
{0: 'Human-translated', 1: 'Machine-translated', 2: 'Natural'}
```
where both `0` and `1` correspond to translationese and `2` is natural.
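For the binary translationese-vs-natural setting used in our evaluation, the three-way prediction can be collapsed as follows (a minimal sketch; the `to_binary_label` helper and its `pred_id` argument, the model's argmax class index, are illustrative):
```python
# Collapse the 3-way label into the binary setting used in the paper:
# 0 (human-translated) and 1 (machine-translated) -> translationese, 2 -> natural
def to_binary_label(pred_id: int) -> str:
    return "translationese" if pred_id in (0, 1) else "natural"
```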
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
- Use in any manner that violates applicable laws or regulations (including trade compliance laws).
- Use in any other way that is prohibited by the Acceptable Use Policy and Apache 2.0 License.
- Use in languages other than the 9 supported languages.
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
The model achieves 79.08% accuracy on `translationese` (combining `human-translated` and `machine-translated`) vs. `natural` in our evaluation, averaged across the aforementioned SEA languages.
Users should be aware of the risk that the model may produce erroneous predictions.
See [our paper](https://arxiv.org/pdf/2406.10118) for more details.
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model.
## How to Use the Model
```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained('SEACrowd/mdeberta-v3_sea_translationese')
model = AutoModelForSequenceClassification.from_pretrained('SEACrowd/mdeberta-v3_sea_translationese')

# Tokenize the input text; return_tensors='pt' is needed so the model receives tensors
inputs = tokenizer('<INPUT_TEXT>', padding='longest', max_length=512, truncation=True, return_tensors='pt')
outputs = model(**inputs)

# The predicted class follows the label mapping above: 0/1 = translationese, 2 = natural
predicted_label = outputs.logits.argmax(dim=-1).item()
```
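Alternatively, the 🤗 `pipeline` API wraps tokenization and inference in one call (a sketch; the Indonesian example sentence is illustrative, and `top_k=None` returns scores for all three labels):
```python
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="SEACrowd/mdeberta-v3_sea_translationese",
    top_k=None,  # return scores for every label instead of only the top one
)
print(classifier("Ini adalah contoh kalimat."))
```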
## Citation
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
If you are using any resources from SEACrowd, including datasheets, dataloaders, code, etc., please cite [the following publication](https://arxiv.org/pdf/2406.10118):
```bibtex
@article{lovenia2024seacrowd,
title={SEACrowd: A Multilingual Multimodal Data Hub and Benchmark Suite for Southeast Asian Languages},
author={Holy Lovenia and Rahmad Mahendra and Salsabil Maulana Akbar and Lester James V. Miranda and Jennifer Santoso and Elyanah Aco and Akhdan Fadhilah and Jonibek Mansurov and Joseph Marvin Imperial and Onno P. Kampman and Joel Ruben Antony Moniz and Muhammad Ravi Shulthan Habibi and Frederikus Hudi and Railey Montalan and Ryan Ignatius and Joanito Agili Lopo and William Nixon and Börje F. Karlsson and James Jaya and Ryandito Diandaru and Yuze Gao and Patrick Amadeus and Bin Wang and Jan Christian Blaise Cruz and Chenxi Whitehouse and Ivan Halim Parmonangan and Maria Khelli and Wenyu Zhang and Lucky Susanto and Reynard Adha Ryanda and Sonny Lazuardi Hermawan and Dan John Velasco and Muhammad Dehan Al Kautsar and Willy Fitra Hendria and Yasmin Moslem and Noah Flynn and Muhammad Farid Adilazuarda and Haochen Li and Johanes Lee and R. Damanhuri and Shuo Sun and Muhammad Reza Qorib and Amirbek Djanibekov and Wei Qi Leong and Quyet V. Do and Niklas Muennighoff and Tanrada Pansuwan and Ilham Firdausi Putra and Yan Xu and Ngee Chia Tai and Ayu Purwarianti and Sebastian Ruder and William Tjhi and Peerat Limkonchotiwat and Alham Fikri Aji and Sedrick Keh and Genta Indra Winata and Ruochen Zhang and Fajri Koto and Zheng-Xin Yong and Samuel Cahyawijaya},
year={2024},
eprint={2406.10118},
journal={arXiv preprint arXiv:2406.10118}
}
``` | [
"TRANSLATION"
]
| [
"CHIA"
]
| Non_BioNLP | {"language": ["en", "id", "ms", "vi", "th", "lo", "km", "my", "tl"], "library_name": "transformers", "license": "apache-2.0", "metrics": ["accuracy"], "pipeline_tag": "text-classification", "tags": ["translationese", "classification", "sea", "southeast asia"]} |
croissantllm/CroissantLLMChat-v0.1 | croissantllm | text-generation | [
"transformers",
"safetensors",
"llama",
"text-generation",
"legal",
"code",
"text-generation-inference",
"art",
"conversational",
"fr",
"en",
"dataset:croissantllm/croissant_dataset",
"dataset:croissantllm/CroissantLLM-2201-sft",
"dataset:cerebras/SlimPajama-627B",
"dataset:uonlp/CulturaX",
"dataset:pg19",
"dataset:bigcode/starcoderdata",
"arxiv:2402.00786",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
]
| 2024-01-24T09:18:45 | 2024-04-26T10:02:01 | 3,614 | 50 | ---
datasets:
- croissantllm/croissant_dataset
- croissantllm/CroissantLLM-2201-sft
- cerebras/SlimPajama-627B
- uonlp/CulturaX
- pg19
- bigcode/starcoderdata
language:
- fr
- en
license: mit
pipeline_tag: text-generation
tags:
- legal
- code
- text-generation-inference
- art
---
# CroissantLLMChat (190k steps + Chat)
This model is part of the CroissantLLM initiative, and corresponds to the checkpoint after 190k steps (2.99T tokens) and a final Chat finetuning phase.
https://arxiv.org/abs/2402.00786
For best performance, it should be used with a temperature of 0.3 or more, and with the exact template described below:
```python
chat = [
{"role": "user", "content": "Que puis-je faire à Marseille en hiver?"},
]
chat_input = tokenizer.apply_chat_template(chat, tokenize=False, add_generation_prompt=True)
```
corresponding to:
```python
chat_input = """<|im_start|>user
{USER QUERY}<|im_end|>
<|im_start|>assistant\n"""
```
## Abstract
We introduce CroissantLLM, a 1.3B language model pretrained on a set of 3T English and French tokens, to bring to the research and industrial community a high-performance, fully open-sourced bilingual model that runs swiftly on consumer-grade local hardware.
To that end, we pioneer the approach of training an intrinsically bilingual model with a 1:1 English-to-French pretraining data ratio, a custom tokenizer, and bilingual finetuning datasets. We release the training dataset, notably containing a French split with manually curated, high-quality, and varied data sources.
To assess performance outside of English, we craft a novel benchmark, FrenchBench, consisting of an array of classification and generation tasks, covering various orthogonal aspects of model performance in the French Language. Additionally, rooted in transparency and to foster further Large Language Model research, we release codebases, and dozens of checkpoints across various model sizes, training data distributions, and training steps, as well as fine-tuned Chat models, and strong translation models. We evaluate our model through the FMTI framework, and validate 81% of the transparency criteria, far beyond the scores of even most open initiatives.
This work enriches the NLP landscape, breaking away from previous English-centric work in order to strengthen our understanding of multilinguality in language models.
## Citation
Our work can be cited as:
```bibtex
@misc{faysse2024croissantllm,
title={CroissantLLM: A Truly Bilingual French-English Language Model},
author={Manuel Faysse and Patrick Fernandes and Nuno M. Guerreiro and António Loison and Duarte M. Alves and Caio Corro and Nicolas Boizard and João Alves and Ricardo Rei and Pedro H. Martins and Antoni Bigata Casademunt and François Yvon and André F. T. Martins and Gautier Viaud and Céline Hudelot and Pierre Colombo},
year={2024},
eprint={2402.00786},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
## Usage
This model is a Chat model, that is, it is finetuned for chat interactions and works best with the provided template.
#### With generate
This might require a stopping criterion on the `<|im_end|>` token.
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
model_name = "croissantllm/CroissantLLMChat-v0.1"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
generation_args = {
"max_new_tokens": 256,
"do_sample": True,
"temperature": 0.3,
"top_p": 0.90,
"top_k": 40,
"repetition_penalty": 1.05,
"eos_token_id": [tokenizer.eos_token_id, 32000],
}
chat = [
{"role": "user", "content": "Qui est le président francais actuel ?"},
]
chat_input = tokenizer.apply_chat_template(chat, tokenize=False, add_generation_prompt=True)
inputs = tokenizer(chat_input, return_tensors="pt").to(model.device)
tokens = model.generate(**inputs, **generation_args)
print(tokenizer.decode(tokens[0]))
# print tokens individually
print([(tokenizer.decode([tok]), tok) for tok in tokens[0].tolist()])
```
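If your version of `transformers` does not honor a list of `eos_token_id`s, an explicit stopping criterion can be passed to `generate` instead (a minimal sketch; the `StopOnTokens` helper is illustrative and reuses `tokenizer`, `model`, `inputs`, and `generation_args` from the snippet above):
```python
from transformers import StoppingCriteria, StoppingCriteriaList

class StopOnTokens(StoppingCriteria):
    """Stop generation as soon as the last generated token is one of the stop ids."""
    def __init__(self, stop_token_ids):
        self.stop_token_ids = set(stop_token_ids)

    def __call__(self, input_ids, scores, **kwargs):
        return input_ids[0, -1].item() in self.stop_token_ids

# Resolve the <|im_end|> id from the tokenizer rather than hard-coding 32000
im_end_id = tokenizer.convert_tokens_to_ids("<|im_end|>")
stopping = StoppingCriteriaList([StopOnTokens([tokenizer.eos_token_id, im_end_id])])

tokens = model.generate(**inputs, **generation_args, stopping_criteria=stopping)
```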
## Model limitations
Evaluation results indicate the model is strong in its size category, offering decent performance on writing-based tasks and internal knowledge, and very strong performance on translation tasks. The small size of the CroissantLLM model, however, limits its capacity to perform more complex reasoning-based tasks, at least in a zero- or few-shot manner in its generalist base or chat-model versions. This is aligned with other models of this size and underlines the importance of scale for more abstract tasks.
#### Knowledge Cutoff
The model training dataset has a data cutoff date corresponding to the November 2023 Wikipedia dump. This is the de facto knowledge cutoff date for our base model, although a lot of information dates back further. Updated versions can be trained through continued pre-training or subsequent fine-tuning.
#### Multilingual performance
CroissantLLM is mostly a French and English model. Code performance is relatively limited, and although some data from other languages is included in the SlimPajama training set, strong out-of-the-box performance in other languages should not be expected, although some European languages do work quite well.
#### Hallucinations
CroissantLLM can hallucinate and output factually incorrect data, especially regarding complex topics. This is to be expected given the small model size, and hallucination rates seem lower than those of most models in the same size category, although no quantitative assessments have been conducted outside of MT-Bench experiments. | [
"TRANSLATION"
]
| [
"CRAFT"
]
| Non_BioNLP | {"datasets": ["croissantllm/croissant_dataset", "croissantllm/CroissantLLM-2201-sft", "cerebras/SlimPajama-627B", "uonlp/CulturaX", "pg19", "bigcode/starcoderdata"], "language": ["fr", "en"], "license": "mit", "pipeline_tag": "text-generation", "tags": ["legal", "code", "text-generation-inference", "art"]} |
Shashwat13333/msmarco-distilbert-base-v4 | Shashwat13333 | sentence-similarity | [
"sentence-transformers",
"safetensors",
"distilbert",
"sentence-similarity",
"feature-extraction",
"generated_from_trainer",
"dataset_size:150",
"loss:MatryoshkaLoss",
"loss:MultipleNegativesRankingLoss",
"en",
"arxiv:1908.10084",
"arxiv:2205.13147",
"arxiv:1705.00652",
"base_model:sentence-transformers/msmarco-distilbert-base-v4",
"base_model:finetune:sentence-transformers/msmarco-distilbert-base-v4",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"text-embeddings-inference",
"endpoints_compatible",
"region:us"
]
| 2025-02-03T09:32:05 | 2025-02-03T14:49:00 | 12 | 0 | ---
base_model: sentence-transformers/msmarco-distilbert-base-v4
language:
- en
library_name: sentence-transformers
license: apache-2.0
metrics:
- cosine_accuracy@1
- cosine_accuracy@3
- cosine_accuracy@5
- cosine_accuracy@10
- cosine_precision@1
- cosine_precision@3
- cosine_precision@5
- cosine_precision@10
- cosine_recall@1
- cosine_recall@3
- cosine_recall@5
- cosine_recall@10
- cosine_ndcg@10
- cosine_mrr@10
- cosine_map@100
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:150
- loss:MatryoshkaLoss
- loss:MultipleNegativesRankingLoss
widget:
- source_sentence: What services does Techchefz Digital offer for AI adoption?
sentences:
- 'How can we get started with your DevOps solutions?
Getting started is easy. Contact us through our website. We''ll schedule a consultation
to discuss your needs, evaluate your current infrastructure, and propose a customized
DevOps solution designed to achieve your goals.'
- "At Techchefz Digital, we specialize in guiding companies through the complexities\
\ of adopting and integrating Artificial Intelligence and Machine Learning technologies.\
\ Our consultancy services are designed to enhance your operational efficiency\
\ and decision-making capabilities across all sectors. With a global network of\
\ AI/ML experts and a commitment to excellence, we are your partners in transforming\
\ innovative possibilities into real-world achievements. \
\ \
\ \n DATA INTELLIGENCE PLATFORMS we\
\ specialize in\nTensorFlow\nDatabricks\nTableau\nPytorch\nOpenAI\nPinecone\""
- 'We are a New breed of innovative digital transformation agency, redefining storytelling
for an always-on world.
With roots dating back to 2017, we started as a pocket size team of enthusiasts
with a goal of helping traditional businesses transform and create dynamic, digital
cultures through disruptive strategies and agile deployment of innovative solutions.'
- source_sentence: Do you provide support 24/7?
sentences:
- 'How do we do Custom Development ?
We follow below process to develop custom web or mobile Application on Agile Methodology,
breaking requirements in pieces and developing and shipping them with considering
utmost quality:
Requirements Analysis
We begin by understanding the client's needs and objectives for the website.
Identify key features, functionality, and any specific design preferences.
Project Planning
Then create a detailed project plan outlining the scope, timeline, and milestones.
Define the technology stack and development tools suitable for the project.
User Experience Design
Then comes the stage of Developing wireframes or prototypes to visualize the website's
structure and layout. We create a custom design that aligns with the brand identity
and user experience goals.
Development
After getting Sign-off on Design from Client, we break the requirements into Sprints
on Agile Methodology, and start developing them.'
- 'This is our Portfolio
Introducing the world of Housing Finance& Banking Firm.
Corporate Website with 10 regional languages in India with analytics and user
personalization and Dashboard for Regional Managers, Sales Agents, etc. to manage
the Builder Requests, approve/deny Properties, manage visits and appointments,
manage leads, etc.
Introducing the world of Global Automotive Brand.We have implemented a Multi Locale
Multilingual Omnichannel platform for Royal Enfield. The platform supports public
websites, customer portals, internal portals, business applications for over 35+
different locations all over the world.
Developed Digital Platform for Students, Guardians, Teachers, Tutors, with AI/ML
in collaboration with Successive Technologies Inc, USA. Cloud, Dev-Sec-Ops &
Data Governance
Managing cloud provisioning and modernization alongside automated infrastructure,
event-driven microservices, containerization, DevOps, cybersecurity, and 24x7
monitoring support ensures efficient, secure, and responsive IT operations.'
- "SERVICES WE PROVIDE\nFlexible engagement models tailored to your needs\nWe specialize\
\ in comprehensive website audits that provide valuable insights and recommendations\
\ to enhance your online presence.\nDigital Strategy & Consulting\nCreating digital\
\ roadmap that transform your digital enterprise and produce a return on investment,\
\ basis our discovery framework, brainstorming sessions & current state analysis.\n\
\nPlatform Selection\nHelping you select the optimal digital experience, commerce,\
\ cloud and marketing platform for your enterprise.\n\nPlatform Builds\nDeploying\
\ next-gen scalable and agile enterprise digital platforms, along with multi-platform\
\ integrations. \nProduct Builds\nHelp you ideate, strategize, and engineer\
\ your product with help of our enterprise frameworks\nInfrastructure\nSpecialize\
\ in multi-cloud infrastructure helping you put forward the right cloud infrastructure\
\ and optimization strategy.\n\nManaged Services\nOperate and monitor your business-critical\
\ applications, data, and IT workloads, along with Application maintenance and\
\ operations.\nTeam Augmentation\nHelp you scale up and augment your existing\
\ team to solve your hiring challenges with our easy to deploy staff augmentation\
\ offerings.\""
- source_sentence: What challenges did the company face in its early days?
sentences:
- 'Why do we need Microservices ?
Instead of building a monolithic application where all functionalities are tightly
integrated, microservices break down the system into modular and loosely coupled
services.
Scalability
Flexibility and Agility
Resilience and Fault Isolation
Technology Diversity
Continuous Delivery'
- 'After a transformative scuba dive in the Maldives, Mayank Maggon made a pivotal
decision to depart from the corporate ladder in December 2016. Fueled by a clear
vision to revolutionize the digital landscape, Mayank set out to leverage the
best technology ingredients, crafting custom applications and digital ecosystems
tailored to clients'' specific needs, limitations, and budgets.
However, this solo journey was not without its challenges. Mayank had to initiate
the revenue engine by offering corporate trainings and conducting online batches
for tech training across the USA. He also undertook small projects and subcontracted
modules of larger projects for clients in the US, UK, and India. It was only after
this initial groundwork that Mayank was able to hire a group of interns, whom
he meticulously trained and groomed to prepare them for handling Enterprise Level
Applications. This journey reflects Mayank''s resilience, determination, and entrepreneurial
spirit in building TechChefz Digital from the ground up.
With a passion for innovation and a relentless drive for excellence, Mayank has
steered TechChefz Digital through strategic partnerships, groundbreaking projects,
and exponential growth. His leadership has been instrumental in shaping TechChefz
Digital into a leading force in the digital transformation arena, inspiring a
culture of innovation and excellence that continues to propel the company forward.'
- 'What makes your DevOps solutions stand out from the competition?
Our DevOps solutions stand out due to our personalized approach, extensive expertise,
and commitment to innovation. We focus on delivering measurable results, such
as reduced deployment times, improved system reliability, and enhanced security,
ensuring you get the maximum benefit from our services.'
- source_sentence: What kind of data do you leverage for AI solutions?
sentences:
- 'Our Solutions
Strategy & Digital Transformation
Innovate via digital transformation, modernize tech, craft product strategies,
enhance customer experiences, optimize data analytics, transition to cloud for
growth and efficiency
Product Engineering & Custom Development
Providing product development, enterprise web and mobile development, microservices
integrations, quality engineering, and application support services to drive innovation
and enhance operational efficiency.'
- 'In what ways can machine learning optimize our operations?
Machine learning algorithms can analyze operational data to identify inefficiencies,
predict maintenance needs, optimize supply chains, and automate repetitive tasks,
significantly improving operational efficiency and reducing costs.'
- Our AI/ML services pave the way for transformative change across industries, embodying
a client-focused approach that integrates seamlessly with human-centric innovation.
Our collaborative teams are dedicated to fostering growth, leveraging data, and
harnessing the predictive power of artificial intelligence to forge the next wave
of software excellence. We don't just deliver AI; we deliver the future.
- source_sentence: What managed services does TechChefz provide ?
sentences:
- " What we do\n\nDigital Strategy\nCreating digital frameworks that transform\
\ your digital enterprise and produce a return on investment.\n\nPlatform Selection\n\
Helping you select the optimal digital experience, commerce, cloud and marketing\
\ platform for your enterprise.\n\nPlatform Builds\nDeploying next-gen scalable\
\ and agile enterprise digital platforms, along with multi-platform integrations.\n\
\nProduct Builds\nHelp you ideate, strategize, and engineer your product with\
\ help of our enterprise frameworks \n\nTeam Augmentation\nHelp you scale up and\
\ augment your existing team to solve your hiring challenges with our easy to\
\ deploy staff augmentation offerings .\nManaged Services\nOperate and monitor\
\ your business-critical applications, data, and IT workloads, along with Application\
\ maintenance and operations\n"
- 'Introducing the world of General Insurance Firm
In this project, we implemented Digital Solution and Implementation with Headless
Drupal as the CMS, and lightweight React JS (Next JS SSR on Node JS) with the
following features:
PWA & AMP based Web Pages
Page Speed Optimization
Reusable and scalable React JS / Next JS Templates and Components
Headless Drupal CMS with Content & Experience management, approval workflows,
etc for seamless collaboration between the business and marketing teams
Minimalistic Buy and Renewal Journeys for various products, with API integrations
and adherence to data compliances
We achieved 250% Reduction in Operational Time and Effort in managing the Content
& Experience for Buy & renew Journeys,220% Reduction in Customer Drops during
buy and renewal journeys, 300% Reduction in bounce rate on policy landing and
campaign pages'
- 'In the Introducing the world of Global Insurance Firm, we crafted Effective Solutions
for Complex Problems and delieverd a comprehensive Website Development, Production
Support & Managed Services, we optimized customer journeys, integrate analytics,
CRM, ERP, and third-party applications, and implement cutting-edge technologies
for enhanced performance and efficiency
and achievied 200% Reduction in operational time & effort managing content & experience,
70% Reduction in Deployment Errors and Downtime, 2.5X Customer Engagement, Conversion
& Retention'
model-index:
- name: BGE base Financial Matryoshka
results:
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 768
type: dim_768
metrics:
- type: cosine_accuracy@1
value: 0.10666666666666667
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.49333333333333335
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.5333333333333333
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.6266666666666667
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.10666666666666667
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.16444444444444445
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.10666666666666667
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.06266666666666666
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.10666666666666667
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.49333333333333335
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.5333333333333333
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.6266666666666667
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.3696947495406473
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.2864550264550264
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.2993424751990436
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 512
type: dim_512
metrics:
- type: cosine_accuracy@1
value: 0.10666666666666667
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.4666666666666667
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.5333333333333333
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.6133333333333333
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.10666666666666667
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.15555555555555556
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.10666666666666667
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.06133333333333333
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.10666666666666667
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.4666666666666667
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.5333333333333333
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.6133333333333333
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.3702942720383175
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.29092063492063486
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.3047495006876888
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 256
type: dim_256
metrics:
- type: cosine_accuracy@1
value: 0.14666666666666667
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.4533333333333333
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.49333333333333335
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.6
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.14666666666666667
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.1511111111111111
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.09866666666666667
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.06
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.14666666666666667
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.4533333333333333
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.49333333333333335
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.6
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.37318151343456746
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.3006455026455026
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.31352550381063704
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 128
type: dim_128
metrics:
- type: cosine_accuracy@1
value: 0.12
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.4533333333333333
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.49333333333333335
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.6
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.12
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.1511111111111111
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.09866666666666667
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.06
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.12
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.4533333333333333
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.49333333333333335
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.6
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.349467831727335
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.26956613756613756
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.2814743968696581
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 64
type: dim_64
metrics:
- type: cosine_accuracy@1
value: 0.16
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.38666666666666666
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.4666666666666667
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.5466666666666666
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.16
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.1288888888888889
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.09333333333333335
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.05466666666666666
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.16
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.38666666666666666
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.4666666666666667
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.5466666666666666
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.34485137335598726
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.28099999999999997
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.29532589563098727
name: Cosine Map@100
---
# BGE base Financial Matryoshka
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [sentence-transformers/msmarco-distilbert-base-v4](https://huggingface.co/sentence-transformers/msmarco-distilbert-base-v4). It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [sentence-transformers/msmarco-distilbert-base-v4](https://huggingface.co/sentence-transformers/msmarco-distilbert-base-v4) <!-- at revision 19f0f4c73dc418bad0e0fc600611e808b7448a28 -->
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
<!-- - **Training Dataset:** Unknown -->
- **Language:** en
- **License:** apache-2.0
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: DistilBertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("Shashwat13333/msmarco-distilbert-base-v4")
# Run inference
sentences = [
'What managed services does TechChefz provide ?',
' What we do\n\nDigital Strategy\nCreating digital frameworks that transform your digital enterprise and produce a return on investment.\n\nPlatform Selection\nHelping you select the optimal digital experience, commerce, cloud and marketing platform for your enterprise.\n\nPlatform Builds\nDeploying next-gen scalable and agile enterprise digital platforms, along with multi-platform integrations.\n\nProduct Builds\nHelp you ideate, strategize, and engineer your product with help of our enterprise frameworks \n\nTeam Augmentation\nHelp you scale up and augment your existing team to solve your hiring challenges with our easy to deploy staff augmentation offerings .\nManaged Services\nOperate and monitor your business-critical applications, data, and IT workloads, along with Application maintenance and operations\n',
'In the Introducing the world of Global Insurance Firm, we crafted Effective Solutions for Complex Problems and delieverd a comprehensive Website Development, Production Support & Managed Services, we optimized customer journeys, integrate analytics, CRM, ERP, and third-party applications, and implement cutting-edge technologies for enhanced performance and efficiency\nand achievied 200% Reduction in operational time & effort managing content & experience, 70% Reduction in Deployment Errors and Downtime, 2.5X Customer Engagement, Conversion & Retention',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
<!--
### Direct Usage (Transformers)
<details><summary>Click to see the direct usage in Transformers</summary>
</details>
-->
<!--
### Downstream Usage (Sentence Transformers)
You can finetune this model on your own dataset.
<details><summary>Click to expand</summary>
</details>
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
## Evaluation
### Metrics
#### Information Retrieval
* Datasets: `dim_768`, `dim_512`, `dim_256`, `dim_128` and `dim_64`
* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
| Metric | dim_768 | dim_512 | dim_256 | dim_128 | dim_64 |
|:--------------------|:-----------|:-----------|:-----------|:-----------|:-----------|
| cosine_accuracy@1 | 0.1067 | 0.1067 | 0.1467 | 0.12 | 0.16 |
| cosine_accuracy@3 | 0.4933 | 0.4667 | 0.4533 | 0.4533 | 0.3867 |
| cosine_accuracy@5 | 0.5333 | 0.5333 | 0.4933 | 0.4933 | 0.4667 |
| cosine_accuracy@10 | 0.6267 | 0.6133 | 0.6 | 0.6 | 0.5467 |
| cosine_precision@1 | 0.1067 | 0.1067 | 0.1467 | 0.12 | 0.16 |
| cosine_precision@3 | 0.1644 | 0.1556 | 0.1511 | 0.1511 | 0.1289 |
| cosine_precision@5 | 0.1067 | 0.1067 | 0.0987 | 0.0987 | 0.0933 |
| cosine_precision@10 | 0.0627 | 0.0613 | 0.06 | 0.06 | 0.0547 |
| cosine_recall@1 | 0.1067 | 0.1067 | 0.1467 | 0.12 | 0.16 |
| cosine_recall@3 | 0.4933 | 0.4667 | 0.4533 | 0.4533 | 0.3867 |
| cosine_recall@5 | 0.5333 | 0.5333 | 0.4933 | 0.4933 | 0.4667 |
| cosine_recall@10 | 0.6267 | 0.6133 | 0.6 | 0.6 | 0.5467 |
| **cosine_ndcg@10** | **0.3697** | **0.3703** | **0.3732** | **0.3495** | **0.3449** |
| cosine_mrr@10 | 0.2865 | 0.2909 | 0.3006 | 0.2696 | 0.281 |
| cosine_map@100 | 0.2993 | 0.3047 | 0.3135 | 0.2815 | 0.2953 |
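Since the model was trained with Matryoshka loss, embeddings can be truncated to any of the evaluated dimensions at load time (a sketch using the `truncate_dim` argument, available in recent Sentence Transformers versions):
```python
from sentence_transformers import SentenceTransformer

# Load the model so that encode() returns 256-dimensional embeddings
model = SentenceTransformer("Shashwat13333/msmarco-distilbert-base-v4", truncate_dim=256)
embeddings = model.encode(["What managed services does TechChefz provide ?"])
print(embeddings.shape)  # (1, 256)
```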
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Dataset
#### Unnamed Dataset
* Size: 150 training samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 150 samples:
| | anchor | positive |
|:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 7 tokens</li><li>mean: 12.45 tokens</li><li>max: 18 tokens</li></ul> | <ul><li>min: 20 tokens</li><li>mean: 126.17 tokens</li><li>max: 378 tokens</li></ul> |
* Samples:
| anchor | positive |
|:----------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>How can digital transformation enhance customer interactions across multiple channels?</code> | <code>We offer custom software development, digital marketing strategies, and tailored solutions to drive tangible results for your business. Our expert team combines technical prowess with industry insights to propel your business forward in the digital landscape.<br><br>"Engage, analyze & target your customers<br>Digital transformation enables you to interact with customers across multiple channels, providing personalized experiences. This could include social media engagement, interactive websites, and mobile apps." "Empower your employees & partners<br>The push for digital transformation has led many companies to embrace cloud solutions. However, the migration and integration of legacy systems into the cloud often present challenges." "Optimize & automate your operations<br>The push for digital transformation has led many companies to embrace cloud solutions. However, the migration and integration of legacy systems into the cloud often present challenges." "Transform your products<br>The push for digi...</code> |
| <code>How does a CRM system improve customer retention?</code> | <code>Our MarTech capabilities<br><br>Personalization<br>Involves tailoring marketing messages and experiences to individual customers. It enhances customer engagement, loyalty, and ultimately, conversion rates.<br><br>Marketing Automation<br>Marketing automation streamlines repetitive tasks such as email marketing, lead nurturing, and social media posting. It improves efficiency, saves time, and ensures timely communication with customers.<br><br>Customer Relationship Management<br>CRM systems help manage interactions with current and potential customers. They store customer data, track interactions, and facilitate communication, improving customer retention.</code> |
| <code>How can your recommendation engines improve our business?</code> | <code>How can your recommendation engines improve our business?<br>Our recommendation engines are designed to analyze customer behavior and preferences to deliver personalized suggestions, enhancing user experience, increasing sales, and boosting customer retention.</code> |
* Loss: [<code>MatryoshkaLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters:
```json
{
"loss": "MultipleNegativesRankingLoss",
"matryoshka_dims": [
768,
512,
256,
128,
64
],
"matryoshka_weights": [
1,
1,
1,
1,
1
],
"n_dims_per_step": -1
}
```
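  For reference, this configuration corresponds to a loss construction along these lines in Sentence Transformers (a sketch; dataset preparation and the trainer call are omitted):
  ```python
  from sentence_transformers import SentenceTransformer
  from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss

  model = SentenceTransformer("sentence-transformers/msmarco-distilbert-base-v4")

  # In-batch negatives ranking loss, wrapped so it is applied at every Matryoshka dimension
  base_loss = MultipleNegativesRankingLoss(model)
  loss = MatryoshkaLoss(model, base_loss, matryoshka_dims=[768, 512, 256, 128, 64])
  ```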
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: epoch
- `gradient_accumulation_steps`: 4
- `learning_rate`: 1e-05
- `weight_decay`: 0.01
- `num_train_epochs`: 4
- `lr_scheduler_type`: cosine
- `warmup_ratio`: 0.1
- `fp16`: True
- `load_best_model_at_end`: True
- `optim`: adamw_torch_fused
- `push_to_hub`: True
- `hub_model_id`: Shashwat13333/msmarco-distilbert-base-v4
- `push_to_hub_model_id`: msmarco-distilbert-base-v4
- `batch_sampler`: no_duplicates
#### All Hyperparameters
<details><summary>Click to expand</summary>
- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: epoch
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 8
- `per_device_eval_batch_size`: 8
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 4
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 1e-05
- `weight_decay`: 0.01
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 4
- `max_steps`: -1
- `lr_scheduler_type`: cosine
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: True
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch_fused
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: True
- `resume_from_checkpoint`: None
- `hub_model_id`: Shashwat13333/msmarco-distilbert-base-v4
- `hub_strategy`: every_save
- `hub_private_repo`: None
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `include_for_metrics`: []
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: msmarco-distilbert-base-v4
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `average_tokens_across_devices`: False
- `prompts`: None
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional
</details>
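Tying the dataset, loss, and training arguments together, a minimal end-to-end sketch (reusing the `model`, `loss`, and `args` objects from the sketches above; the one-pair datasets are hypothetical stand-ins for the real 150 samples, and because `args` sets `push_to_hub=True`, a real run assumes you are authenticated to the Hub):
```python
from datasets import Dataset
from sentence_transformers import SentenceTransformerTrainer

# Hypothetical stand-ins for the (anchor, positive) pairs shown in the samples table.
train_dataset = Dataset.from_dict({
    "anchor": ["How does a CRM system improve customer retention?"],
    "positive": ["CRM systems help manage interactions with current and potential customers."],
})
eval_dataset = Dataset.from_dict({
    "anchor": ["How can your recommendation engines improve our business?"],
    "positive": ["Our recommendation engines deliver personalized suggestions to boost retention."],
})

trainer = SentenceTransformerTrainer(
    model=model,                # the SentenceTransformer being fine-tuned
    args=args,                  # SentenceTransformerTrainingArguments from above
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,  # needed because eval_strategy="epoch"
    loss=loss,                  # the MatryoshkaLoss wrapper
)
trainer.train()
```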
### Training Logs
| Epoch | Step | Training Loss | dim_768_cosine_ndcg@10 | dim_512_cosine_ndcg@10 | dim_256_cosine_ndcg@10 | dim_128_cosine_ndcg@10 | dim_64_cosine_ndcg@10 |
|:----------:|:-----:|:-------------:|:----------------------:|:----------------------:|:----------------------:|:----------------------:|:---------------------:|
| 0.2105 | 1 | 3.5757 | - | - | - | - | - |
| **0.8421** | **4** | **-** | **0.3563** | **0.3543** | **0.3378** | **0.3681** | **0.3077** |
| 1.2105 | 5 | 4.4031 | - | - | - | - | - |
| 1.8421 | 8 | - | 0.3652 | 0.3547 | 0.3574 | 0.3542 | 0.3579 |
| 2.4211 | 10 | 3.3423 | - | - | - | - | - |
| 2.8421 | 12 | - | 0.3783 | 0.3680 | 0.3558 | 0.3807 | 0.3408 |
| 3.6316 | 15 | 2.3695 | - | - | - | - | - |
| 3.8421 | 16 | - | 0.3697 | 0.3703 | 0.3732 | 0.3495 | 0.3449 |
* The bold row denotes the saved checkpoint.
### Framework Versions
- Python: 3.11.11
- Sentence Transformers: 3.3.1
- Transformers: 4.47.1
- PyTorch: 2.5.1+cu124
- Accelerate: 1.2.1
- Datasets: 3.2.0
- Tokenizers: 0.21.0
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
```
#### MatryoshkaLoss
```bibtex
@misc{kusupati2024matryoshka,
title={Matryoshka Representation Learning},
author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
year={2024},
eprint={2205.13147},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
```
#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
--> | [
"TEXT_CLASSIFICATION"
]
| [
"CRAFT"
]
| Non_BioNLP |
# BGE base Financial Matryoshka
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [sentence-transformers/msmarco-distilbert-base-v4](https://huggingface.co/sentence-transformers/msmarco-distilbert-base-v4). It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [sentence-transformers/msmarco-distilbert-base-v4](https://huggingface.co/sentence-transformers/msmarco-distilbert-base-v4) <!-- at revision 19f0f4c73dc418bad0e0fc600611e808b7448a28 -->
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
<!-- - **Training Dataset:** Unknown -->
- **Language:** en
- **License:** apache-2.0
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: DistilBertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
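The pooling configuration above means a sentence embedding is the attention-masked mean of the token embeddings. A sketch reproducing this with plain 🤗 transformers, assuming the repository exposes the usual DistilBERT weights and tokenizer at its root:
```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Shashwat13333/msmarco-distilbert-base-v4")
encoder = AutoModel.from_pretrained("Shashwat13333/msmarco-distilbert-base-v4")

inputs = tokenizer(["What managed services does TechChefz provide ?"],
                   padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    token_embeddings = encoder(**inputs).last_hidden_state  # (batch, seq_len, 768)

# Mean pooling over non-padding tokens, mirroring the Pooling module above.
mask = inputs["attention_mask"].unsqueeze(-1).float()
sentence_embeddings = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1)
print(sentence_embeddings.shape)  # torch.Size([1, 768])
```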
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("Shashwat13333/msmarco-distilbert-base-v4")
# Run inference
sentences = [
'What managed services does TechChefz provide ?',
' What we do\n\nDigital Strategy\nCreating digital frameworks that transform your digital enterprise and produce a return on investment.\n\nPlatform Selection\nHelping you select the optimal digital experience, commerce, cloud and marketing platform for your enterprise.\n\nPlatform Builds\nDeploying next-gen scalable and agile enterprise digital platforms, along with multi-platform integrations.\n\nProduct Builds\nHelp you ideate, strategize, and engineer your product with help of our enterprise frameworks \n\nTeam Augmentation\nHelp you scale up and augment your existing team to solve your hiring challenges with our easy to deploy staff augmentation offerings .\nManaged Services\nOperate and monitor your business-critical applications, data, and IT workloads, along with Application maintenance and operations\n',
'In the Introducing the world of Global Insurance Firm, we crafted Effective Solutions for Complex Problems and delieverd a comprehensive Website Development, Production Support & Managed Services, we optimized customer journeys, integrate analytics, CRM, ERP, and third-party applications, and implement cutting-edge technologies for enhanced performance and efficiency\nand achievied 200% Reduction in operational time & effort managing content & experience, 70% Reduction in Deployment Errors and Downtime, 2.5X Customer Engagement, Conversion & Retention',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 768)
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# torch.Size([3, 3])
```
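Because the model was trained with a Matryoshka objective, embeddings can also be truncated to the smaller sizes evaluated below. A sketch using the `truncate_dim` loading option (available in recent sentence-transformers releases):
```python
from sentence_transformers import SentenceTransformer

# Load the model so that encode() returns 256-dimensional embeddings.
model_256 = SentenceTransformer("Shashwat13333/msmarco-distilbert-base-v4", truncate_dim=256)
embeddings = model_256.encode(["What managed services does TechChefz provide ?"])
print(embeddings.shape)
# (1, 256)
```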
<!--
### Direct Usage (Transformers)
<details><summary>Click to see the direct usage in Transformers</summary>
</details>
-->
<!--
### Downstream Usage (Sentence Transformers)
You can finetune this model on your own dataset.
<details><summary>Click to expand</summary>
</details>
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
## Evaluation
### Metrics
#### Information Retrieval
* Datasets: `dim_768`, `dim_512`, `dim_256`, `dim_128` and `dim_64`
* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
| Metric | dim_768 | dim_512 | dim_256 | dim_128 | dim_64 |
|:--------------------|:-----------|:-----------|:-----------|:-----------|:-----------|
| cosine_accuracy@1 | 0.1067 | 0.1067 | 0.1467 | 0.12 | 0.16 |
| cosine_accuracy@3 | 0.4933 | 0.4667 | 0.4533 | 0.4533 | 0.3867 |
| cosine_accuracy@5 | 0.5333 | 0.5333 | 0.4933 | 0.4933 | 0.4667 |
| cosine_accuracy@10 | 0.6267 | 0.6133 | 0.6 | 0.6 | 0.5467 |
| cosine_precision@1 | 0.1067 | 0.1067 | 0.1467 | 0.12 | 0.16 |
| cosine_precision@3 | 0.1644 | 0.1556 | 0.1511 | 0.1511 | 0.1289 |
| cosine_precision@5 | 0.1067 | 0.1067 | 0.0987 | 0.0987 | 0.0933 |
| cosine_precision@10 | 0.0627 | 0.0613 | 0.06 | 0.06 | 0.0547 |
| cosine_recall@1 | 0.1067 | 0.1067 | 0.1467 | 0.12 | 0.16 |
| cosine_recall@3 | 0.4933 | 0.4667 | 0.4533 | 0.4533 | 0.3867 |
| cosine_recall@5 | 0.5333 | 0.5333 | 0.4933 | 0.4933 | 0.4667 |
| cosine_recall@10 | 0.6267 | 0.6133 | 0.6 | 0.6 | 0.5467 |
| **cosine_ndcg@10** | **0.3697** | **0.3703** | **0.3732** | **0.3495** | **0.3449** |
| cosine_mrr@10 | 0.2865 | 0.2909 | 0.3006 | 0.2696 | 0.281 |
| cosine_map@100 | 0.2993 | 0.3047 | 0.3135 | 0.2815 | 0.2953 |
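The table comes from `InformationRetrievalEvaluator` runs at each truncation size. A minimal sketch of such a run, with toy relevance data standing in for the real held-out queries and passages:
```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

# Toy relevance judgments; a real run uses the held-out (anchor, positive) pairs.
queries = {"q1": "How does a CRM system improve customer retention?"}
corpus = {"d1": "CRM systems help manage interactions with current and potential customers."}
relevant_docs = {"q1": {"d1"}}

for dim in (768, 512, 256, 128, 64):
    model = SentenceTransformer("Shashwat13333/msmarco-distilbert-base-v4", truncate_dim=dim)
    evaluator = InformationRetrievalEvaluator(queries, corpus, relevant_docs, name=f"dim_{dim}")
    metrics = evaluator(model)  # dict of accuracy/precision/recall/NDCG/MRR/MAP values
    print(metrics)
```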
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Dataset
#### Unnamed Dataset
* Size: 150 training samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 150 samples:
| | anchor | positive |
|:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 7 tokens</li><li>mean: 12.45 tokens</li><li>max: 18 tokens</li></ul> | <ul><li>min: 20 tokens</li><li>mean: 126.17 tokens</li><li>max: 378 tokens</li></ul> |
* Samples:
| anchor | positive |
|:----------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>How can digital transformation enhance customer interactions across multiple channels?</code> | <code>We offer custom software development, digital marketing strategies, and tailored solutions to drive tangible results for your business. Our expert team combines technical prowess with industry insights to propel your business forward in the digital landscape.<br><br>"Engage, analyze & target your customers<br>Digital transformation enables you to interact with customers across multiple channels, providing personalized experiences. This could include social media engagement, interactive websites, and mobile apps." "Empower your employees & partners<br>The push for digital transformation has led many companies to embrace cloud solutions. However, the migration and integration of legacy systems into the cloud often present challenges." "Optimize & automate your operations<br>The push for digital transformation has led many companies to embrace cloud solutions. However, the migration and integration of legacy systems into the cloud often present challenges." "Transform your products<br>The push for digi...</code> |
| <code>How does a CRM system improve customer retention?</code> | <code>Our MarTech capabilities<br><br>Personalization<br>Involves tailoring marketing messages and experiences to individual customers. It enhances customer engagement, loyalty, and ultimately, conversion rates.<br><br>Marketing Automation<br>Marketing automation streamlines repetitive tasks such as email marketing, lead nurturing, and social media posting. It improves efficiency, saves time, and ensures timely communication with customers.<br><br>Customer Relationship Management<br>CRM systems help manage interactions with current and potential customers. They store customer data, track interactions, and facilitate communication, improving customer retention.</code> |
| <code>How can your recommendation engines improve our business?</code> | <code>How can your recommendation engines improve our business?<br>Our recommendation engines are designed to analyze customer behavior and preferences to deliver personalized suggestions, enhancing user experience, increasing sales, and boosting customer retention.</code> |
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
--> | {"base_model": "sentence-transformers/msmarco-distilbert-base-v4", "language": ["en"], "library_name": "sentence-transformers", "license": "apache-2.0", "metrics": ["cosine_accuracy@1", "cosine_accuracy@3", "cosine_accuracy@5", "cosine_accuracy@10", "cosine_precision@1", "cosine_precision@3", "cosine_precision@5", "cosine_precision@10", "cosine_recall@1", "cosine_recall@3", "cosine_recall@5", "cosine_recall@10", "cosine_ndcg@10", "cosine_mrr@10", "cosine_map@100"], "pipeline_tag": "sentence-similarity", "tags": ["sentence-transformers", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:150", "loss:MatryoshkaLoss", "loss:MultipleNegativesRankingLoss"], "widget": [{"source_sentence": "What services does Techchefz Digital offer for AI adoption?", "sentences": ["How can we get started with your DevOps solutions?\nGetting started is easy. Contact us through our website. We'll schedule a consultation to discuss your needs, evaluate your current infrastructure, and propose a customized DevOps solution designed to achieve your goals.", "At Techchefz Digital, we specialize in guiding companies through the complexities of adopting and integrating Artificial Intelligence and Machine Learning technologies. Our consultancy services are designed to enhance your operational efficiency and decision-making capabilities across all sectors. With a global network of AI/ML experts and a commitment to excellence, we are your partners in transforming innovative possibilities into real-world achievements. \n DATA INTELLIGENCE PLATFORMS we specialize in\nTensorFlow\nDatabricks\nTableau\nPytorch\nOpenAI\nPinecone\"", "We are a New breed of innovative digital transformation agency, redefining storytelling for an always-on world.\nWith roots dating back to 2017, we started as a pocket size team of enthusiasts with a goal of helping traditional businesses transform and create dynamic, digital cultures through disruptive strategies and agile deployment of innovative solutions."]}, {"source_sentence": "Do you provide support 24/7?", "sentences": ["How do we do Custom Development ?\nWe follow below process to develop custom web or mobile Application on Agile Methodology, breaking requirements in pieces and developing and shipping them with considering utmost quality:\nRequirements Analysis\nWe begin by understanding the client's needs and objectives for the website. Identify key features, functionality, and any specific design preferences.\n\nProject Planning\nThen create a detailed project plan outlining the scope, timeline, and milestones. Define the technology stack and development tools suitable for the project.\n\nUser Experience Design\nThen comes the stage of Developing wireframes or prototypes to visualize the website's structure and layout. We create a custom design that aligns with the brand identity and user experience goals.\n\nDevelopment\nAfter getting Sign-off on Design from Client, we break the requirements into Sprints on Agile Methodology, and start developing them.", "This is our Portfolio\nIntroducing the world of Housing Finance& Banking Firm.\nCorporate Website with 10 regional languages in India with analytics and user personalization and Dashboard for Regional Managers, Sales Agents, etc. to manage the Builder Requests, approve/deny Properties, manage visits and appointments, manage leads, etc.\n\n\nIntroducing the world of Global Automotive Brand.We have implemented a Multi Locale Multilingual Omnichannel platform for Royal Enfield. 
The platform supports public websites, customer portals, internal portals, business applications for over 35+ different locations all over the world.\n\nDeveloped Digital Platform for Students, Guardians, Teachers, Tutors, with AI/ML in collaboration with Successive Technologies Inc, USA. Cloud, Dev-Sec-Ops & Data Governance\nManaging cloud provisioning and modernization alongside automated infrastructure, event-driven microservices, containerization, DevOps, cybersecurity, and 24x7 monitoring support ensures efficient, secure, and responsive IT operations.", "SERVICES WE PROVIDE\nFlexible engagement models tailored to your needs\nWe specialize in comprehensive website audits that provide valuable insights and recommendations to enhance your online presence.\nDigital Strategy & Consulting\nCreating digital roadmap that transform your digital enterprise and produce a return on investment, basis our discovery framework, brainstorming sessions & current state analysis.\n\nPlatform Selection\nHelping you select the optimal digital experience, commerce, cloud and marketing platform for your enterprise.\n\nPlatform Builds\nDeploying next-gen scalable and agile enterprise digital platforms, along with multi-platform integrations. \nProduct Builds\nHelp you ideate, strategize, and engineer your product with help of our enterprise frameworks\nInfrastructure\nSpecialize in multi-cloud infrastructure helping you put forward the right cloud infrastructure and optimization strategy.\n\nManaged Services\nOperate and monitor your business-critical applications, data, and IT workloads, along with Application maintenance and operations.\nTeam Augmentation\nHelp you scale up and augment your existing team to solve your hiring challenges with our easy to deploy staff augmentation offerings.\""]}, {"source_sentence": "What challenges did the company face in its early days?", "sentences": ["Why do we need Microservices ?\nInstead of building a monolithic application where all functionalities are tightly integrated, microservices break down the system into modular and loosely coupled services.\n\nScalability\nFlexibility and Agility\nResilience and Fault Isolation\nTechnology Diversity\nContinuous Delivery", "After a transformative scuba dive in the Maldives, Mayank Maggon made a pivotal decision to depart from the corporate ladder in December 2016. Fueled by a clear vision to revolutionize the digital landscape, Mayank set out to leverage the best technology ingredients, crafting custom applications and digital ecosystems tailored to clients' specific needs, limitations, and budgets.\n\nHowever, this solo journey was not without its challenges. Mayank had to initiate the revenue engine by offering corporate trainings and conducting online batches for tech training across the USA. He also undertook small projects and subcontracted modules of larger projects for clients in the US, UK, and India. It was only after this initial groundwork that Mayank was able to hire a group of interns, whom he meticulously trained and groomed to prepare them for handling Enterprise Level Applications. This journey reflects Mayank's resilience, determination, and entrepreneurial spirit in building TechChefz Digital from the ground up.\n\nWith a passion for innovation and a relentless drive for excellence, Mayank has steered TechChefz Digital through strategic partnerships, groundbreaking projects, and exponential growth. 
His leadership has been instrumental in shaping TechChefz Digital into a leading force in the digital transformation arena, inspiring a culture of innovation and excellence that continues to propel the company forward.", "What makes your DevOps solutions stand out from the competition?\nOur DevOps solutions stand out due to our personalized approach, extensive expertise, and commitment to innovation. We focus on delivering measurable results, such as reduced deployment times, improved system reliability, and enhanced security, ensuring you get the maximum benefit from our services."]}, {"source_sentence": "What kind of data do you leverage for AI solutions?", "sentences": ["Our Solutions\nStrategy & Digital Transformation\nInnovate via digital transformation, modernize tech, craft product strategies, enhance customer experiences, optimize data analytics, transition to cloud for growth and efficiency\n\nProduct Engineering & Custom Development\nProviding product development, enterprise web and mobile development, microservices integrations, quality engineering, and application support services to drive innovation and enhance operational efficiency.", "In what ways can machine learning optimize our operations?\nMachine learning algorithms can analyze operational data to identify inefficiencies, predict maintenance needs, optimize supply chains, and automate repetitive tasks, significantly improving operational efficiency and reducing costs.", "Our AI/ML services pave the way for transformative change across industries, embodying a client-focused approach that integrates seamlessly with human-centric innovation. Our collaborative teams are dedicated to fostering growth, leveraging data, and harnessing the predictive power of artificial intelligence to forge the next wave of software excellence. 
We don't just deliver AI; we deliver the future."]}, {"source_sentence": "What managed services does TechChefz provide ?", "sentences": [" What we do\n\nDigital Strategy\nCreating digital frameworks that transform your digital enterprise and produce a return on investment.\n\nPlatform Selection\nHelping you select the optimal digital experience, commerce, cloud and marketing platform for your enterprise.\n\nPlatform Builds\nDeploying next-gen scalable and agile enterprise digital platforms, along with multi-platform integrations.\n\nProduct Builds\nHelp you ideate, strategize, and engineer your product with help of our enterprise frameworks \n\nTeam Augmentation\nHelp you scale up and augment your existing team to solve your hiring challenges with our easy to deploy staff augmentation offerings .\nManaged Services\nOperate and monitor your business-critical applications, data, and IT workloads, along with Application maintenance and operations\n", "Introducing the world of General Insurance Firm\nIn this project, we implemented Digital Solution and Implementation with Headless Drupal as the CMS, and lightweight React JS (Next JS SSR on Node JS) with the following features:\nPWA & AMP based Web Pages\nPage Speed Optimization\nReusable and scalable React JS / Next JS Templates and Components\nHeadless Drupal CMS with Content & Experience management, approval workflows, etc for seamless collaboration between the business and marketing teams\nMinimalistic Buy and Renewal Journeys for various products, with API integrations and adherence to data compliances\n\nWe achieved 250% Reduction in Operational Time and Effort in managing the Content & Experience for Buy & renew Journeys,220% Reduction in Customer Drops during buy and renewal journeys, 300% Reduction in bounce rate on policy landing and campaign pages", "In the Introducing the world of Global Insurance Firm, we crafted Effective Solutions for Complex Problems and delieverd a comprehensive Website Development, Production Support & Managed Services, we optimized customer journeys, integrate analytics, CRM, ERP, and third-party applications, and implement cutting-edge technologies for enhanced performance and efficiency\nand achievied 200% Reduction in operational time & effort managing content & experience, 70% Reduction in Deployment Errors and Downtime, 2.5X Customer Engagement, Conversion & Retention"]}], "model-index": [{"name": "BGE base Financial Matryoshka", "results": [{"task": {"type": "information-retrieval", "name": "Information Retrieval"}, "dataset": {"name": "dim 768", "type": "dim_768"}, "metrics": [{"type": "cosine_accuracy@1", "value": 0.10666666666666667, "name": "Cosine Accuracy@1"}, {"type": "cosine_accuracy@3", "value": 0.49333333333333335, "name": "Cosine Accuracy@3"}, {"type": "cosine_accuracy@5", "value": 0.5333333333333333, "name": "Cosine Accuracy@5"}, {"type": "cosine_accuracy@10", "value": 0.6266666666666667, "name": "Cosine Accuracy@10"}, {"type": "cosine_precision@1", "value": 0.10666666666666667, "name": "Cosine Precision@1"}, {"type": "cosine_precision@3", "value": 0.16444444444444445, "name": "Cosine Precision@3"}, {"type": "cosine_precision@5", "value": 0.10666666666666667, "name": "Cosine Precision@5"}, {"type": "cosine_precision@10", "value": 0.06266666666666666, "name": "Cosine Precision@10"}, {"type": "cosine_recall@1", "value": 0.10666666666666667, "name": "Cosine Recall@1"}, {"type": "cosine_recall@3", "value": 0.49333333333333335, "name": "Cosine Recall@3"}, {"type": "cosine_recall@5", "value": 
0.5333333333333333, "name": "Cosine Recall@5"}, {"type": "cosine_recall@10", "value": 0.6266666666666667, "name": "Cosine Recall@10"}, {"type": "cosine_ndcg@10", "value": 0.3696947495406473, "name": "Cosine Ndcg@10"}, {"type": "cosine_mrr@10", "value": 0.2864550264550264, "name": "Cosine Mrr@10"}, {"type": "cosine_map@100", "value": 0.2993424751990436, "name": "Cosine Map@100"}]}, {"task": {"type": "information-retrieval", "name": "Information Retrieval"}, "dataset": {"name": "dim 512", "type": "dim_512"}, "metrics": [{"type": "cosine_accuracy@1", "value": 0.10666666666666667, "name": "Cosine Accuracy@1"}, {"type": "cosine_accuracy@3", "value": 0.4666666666666667, "name": "Cosine Accuracy@3"}, {"type": "cosine_accuracy@5", "value": 0.5333333333333333, "name": "Cosine Accuracy@5"}, {"type": "cosine_accuracy@10", "value": 0.6133333333333333, "name": "Cosine Accuracy@10"}, {"type": "cosine_precision@1", "value": 0.10666666666666667, "name": "Cosine Precision@1"}, {"type": "cosine_precision@3", "value": 0.15555555555555556, "name": "Cosine Precision@3"}, {"type": "cosine_precision@5", "value": 0.10666666666666667, "name": "Cosine Precision@5"}, {"type": "cosine_precision@10", "value": 0.06133333333333333, "name": "Cosine Precision@10"}, {"type": "cosine_recall@1", "value": 0.10666666666666667, "name": "Cosine Recall@1"}, {"type": "cosine_recall@3", "value": 0.4666666666666667, "name": "Cosine Recall@3"}, {"type": "cosine_recall@5", "value": 0.5333333333333333, "name": "Cosine Recall@5"}, {"type": "cosine_recall@10", "value": 0.6133333333333333, "name": "Cosine Recall@10"}, {"type": "cosine_ndcg@10", "value": 0.3702942720383175, "name": "Cosine Ndcg@10"}, {"type": "cosine_mrr@10", "value": 0.29092063492063486, "name": "Cosine Mrr@10"}, {"type": "cosine_map@100", "value": 0.3047495006876888, "name": "Cosine Map@100"}]}, {"task": {"type": "information-retrieval", "name": "Information Retrieval"}, "dataset": {"name": "dim 256", "type": "dim_256"}, "metrics": [{"type": "cosine_accuracy@1", "value": 0.14666666666666667, "name": "Cosine Accuracy@1"}, {"type": "cosine_accuracy@3", "value": 0.4533333333333333, "name": "Cosine Accuracy@3"}, {"type": "cosine_accuracy@5", "value": 0.49333333333333335, "name": "Cosine Accuracy@5"}, {"type": "cosine_accuracy@10", "value": 0.6, "name": "Cosine Accuracy@10"}, {"type": "cosine_precision@1", "value": 0.14666666666666667, "name": "Cosine Precision@1"}, {"type": "cosine_precision@3", "value": 0.1511111111111111, "name": "Cosine Precision@3"}, {"type": "cosine_precision@5", "value": 0.09866666666666667, "name": "Cosine Precision@5"}, {"type": "cosine_precision@10", "value": 0.06, "name": "Cosine Precision@10"}, {"type": "cosine_recall@1", "value": 0.14666666666666667, "name": "Cosine Recall@1"}, {"type": "cosine_recall@3", "value": 0.4533333333333333, "name": "Cosine Recall@3"}, {"type": "cosine_recall@5", "value": 0.49333333333333335, "name": "Cosine Recall@5"}, {"type": "cosine_recall@10", "value": 0.6, "name": "Cosine Recall@10"}, {"type": "cosine_ndcg@10", "value": 0.37318151343456746, "name": "Cosine Ndcg@10"}, {"type": "cosine_mrr@10", "value": 0.3006455026455026, "name": "Cosine Mrr@10"}, {"type": "cosine_map@100", "value": 0.31352550381063704, "name": "Cosine Map@100"}]}, {"task": {"type": "information-retrieval", "name": "Information Retrieval"}, "dataset": {"name": "dim 128", "type": "dim_128"}, "metrics": [{"type": "cosine_accuracy@1", "value": 0.12, "name": "Cosine Accuracy@1"}, {"type": "cosine_accuracy@3", "value": 0.4533333333333333, "name": "Cosine 
Accuracy@3"}, {"type": "cosine_accuracy@5", "value": 0.49333333333333335, "name": "Cosine Accuracy@5"}, {"type": "cosine_accuracy@10", "value": 0.6, "name": "Cosine Accuracy@10"}, {"type": "cosine_precision@1", "value": 0.12, "name": "Cosine Precision@1"}, {"type": "cosine_precision@3", "value": 0.1511111111111111, "name": "Cosine Precision@3"}, {"type": "cosine_precision@5", "value": 0.09866666666666667, "name": "Cosine Precision@5"}, {"type": "cosine_precision@10", "value": 0.06, "name": "Cosine Precision@10"}, {"type": "cosine_recall@1", "value": 0.12, "name": "Cosine Recall@1"}, {"type": "cosine_recall@3", "value": 0.4533333333333333, "name": "Cosine Recall@3"}, {"type": "cosine_recall@5", "value": 0.49333333333333335, "name": "Cosine Recall@5"}, {"type": "cosine_recall@10", "value": 0.6, "name": "Cosine Recall@10"}, {"type": "cosine_ndcg@10", "value": 0.349467831727335, "name": "Cosine Ndcg@10"}, {"type": "cosine_mrr@10", "value": 0.26956613756613756, "name": "Cosine Mrr@10"}, {"type": "cosine_map@100", "value": 0.2814743968696581, "name": "Cosine Map@100"}]}, {"task": {"type": "information-retrieval", "name": "Information Retrieval"}, "dataset": {"name": "dim 64", "type": "dim_64"}, "metrics": [{"type": "cosine_accuracy@1", "value": 0.16, "name": "Cosine Accuracy@1"}, {"type": "cosine_accuracy@3", "value": 0.38666666666666666, "name": "Cosine Accuracy@3"}, {"type": "cosine_accuracy@5", "value": 0.4666666666666667, "name": "Cosine Accuracy@5"}, {"type": "cosine_accuracy@10", "value": 0.5466666666666666, "name": "Cosine Accuracy@10"}, {"type": "cosine_precision@1", "value": 0.16, "name": "Cosine Precision@1"}, {"type": "cosine_precision@3", "value": 0.1288888888888889, "name": "Cosine Precision@3"}, {"type": "cosine_precision@5", "value": 0.09333333333333335, "name": "Cosine Precision@5"}, {"type": "cosine_precision@10", "value": 0.05466666666666666, "name": "Cosine Precision@10"}, {"type": "cosine_recall@1", "value": 0.16, "name": "Cosine Recall@1"}, {"type": "cosine_recall@3", "value": 0.38666666666666666, "name": "Cosine Recall@3"}, {"type": "cosine_recall@5", "value": 0.4666666666666667, "name": "Cosine Recall@5"}, {"type": "cosine_recall@10", "value": 0.5466666666666666, "name": "Cosine Recall@10"}, {"type": "cosine_ndcg@10", "value": 0.34485137335598726, "name": "Cosine Ndcg@10"}, {"type": "cosine_mrr@10", "value": 0.28099999999999997, "name": "Cosine Mrr@10"}, {"type": "cosine_map@100", "value": 0.29532589563098727, "name": "Cosine Map@100"}]}]}]} |
linhphanff/stella_en_1.5B_v5_clone | linhphanff | sentence-similarity | [
"sentence-transformers",
"pytorch",
"safetensors",
"qwen2",
"text-generation",
"mteb",
"transformers",
"sentence-similarity",
"custom_code",
"arxiv:2205.13147",
"license:mit",
"model-index",
"autotrain_compatible",
"text-generation-inference",
"text-embeddings-inference",
"endpoints_compatible",
"region:us"
]
| 2024-09-25T02:13:25 | 2024-09-27T02:04:02 | 15 | 0 | ---
license: mit
tags:
- mteb
- sentence-transformers
- transformers
- sentence-similarity
model-index:
- name: stella_en_1.5B_v5
results:
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (en)
type: mteb/amazon_counterfactual
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 92.86567164179104
- type: ap
value: 72.13503907102613
- type: ap_weighted
value: 72.13503907102613
- type: f1
value: 89.5586886376355
- type: f1_weighted
value: 93.13621183004571
- type: main_score
value: 92.86567164179104
- task:
type: Classification
dataset:
name: MTEB AmazonPolarityClassification
type: mteb/amazon_polarity
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 97.16485
- type: ap
value: 96.05546315415225
- type: ap_weighted
value: 96.05546315415225
- type: f1
value: 97.16351087403213
- type: f1_weighted
value: 97.16351087403213
- type: main_score
value: 97.16485
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (en)
type: mteb/amazon_reviews_multi
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 59.358
- type: f1
value: 59.0264615883114
- type: f1_weighted
value: 59.0264615883114
- type: main_score
value: 59.358
- task:
type: Retrieval
dataset:
name: MTEB ArguAna
type: mteb/arguana
config: default
split: test
revision: c22ab2a51041ffd869aaddef7af8d8215647e41a
metrics:
- type: main_score
value: 65.269
- type: map_at_1
value: 41.607
- type: map_at_10
value: 57.104
- type: map_at_100
value: 57.621
- type: map_at_1000
value: 57.621
- type: map_at_20
value: 57.533
- type: map_at_3
value: 52.891999999999996
- type: map_at_5
value: 55.371
- type: mrr_at_1
value: 42.318634423897585
- type: mrr_at_10
value: 57.353970511865406
- type: mrr_at_100
value: 57.88398078476526
- type: mrr_at_1000
value: 57.88467807648422
- type: mrr_at_20
value: 57.796730533206166
- type: mrr_at_3
value: 53.200568990042775
- type: mrr_at_5
value: 55.6330014224753
- type: nauc_map_at_1000_diff1
value: 24.54414600428287
- type: nauc_map_at_1000_max
value: -8.389738078358459
- type: nauc_map_at_1000_std
value: -18.188787645801366
- type: nauc_map_at_100_diff1
value: 24.543138576462308
- type: nauc_map_at_100_max
value: -8.390896839752044
- type: nauc_map_at_100_std
value: -18.192549240185247
- type: nauc_map_at_10_diff1
value: 24.219607088995822
- type: nauc_map_at_10_max
value: -8.245734391254308
- type: nauc_map_at_10_std
value: -18.229706566466447
- type: nauc_map_at_1_diff1
value: 29.325201664812788
- type: nauc_map_at_1_max
value: -11.742800494823971
- type: nauc_map_at_1_std
value: -18.610215769702528
- type: nauc_map_at_20_diff1
value: 24.471097562798803
- type: nauc_map_at_20_max
value: -8.318035874000799
- type: nauc_map_at_20_std
value: -18.171541096773108
- type: nauc_map_at_3_diff1
value: 24.275846107642824
- type: nauc_map_at_3_max
value: -8.212242049581894
- type: nauc_map_at_3_std
value: -17.920379368937496
- type: nauc_map_at_5_diff1
value: 23.873692493209255
- type: nauc_map_at_5_max
value: -8.110347163828767
- type: nauc_map_at_5_std
value: -18.20863325596931
- type: nauc_mrr_at_1000_diff1
value: 22.656410956419975
- type: nauc_mrr_at_1000_max
value: -8.924888102233243
- type: nauc_mrr_at_1000_std
value: -18.103674384502526
- type: nauc_mrr_at_100_diff1
value: 22.655448817140968
- type: nauc_mrr_at_100_max
value: -8.926034318499038
- type: nauc_mrr_at_100_std
value: -18.10743930104164
- type: nauc_mrr_at_10_diff1
value: 22.297536272996872
- type: nauc_mrr_at_10_max
value: -8.836407556658274
- type: nauc_mrr_at_10_std
value: -18.1598393044477
- type: nauc_mrr_at_1_diff1
value: 27.419572424489708
- type: nauc_mrr_at_1_max
value: -11.42241314820691
- type: nauc_mrr_at_1_std
value: -18.54893865856313
- type: nauc_mrr_at_20_diff1
value: 22.590227214657418
- type: nauc_mrr_at_20_max
value: -8.849986456376993
- type: nauc_mrr_at_20_std
value: -18.0862391777352
- type: nauc_mrr_at_3_diff1
value: 22.415270167774988
- type: nauc_mrr_at_3_max
value: -8.692871854156435
- type: nauc_mrr_at_3_std
value: -17.6740102891955
- type: nauc_mrr_at_5_diff1
value: 21.96284578521464
- type: nauc_mrr_at_5_max
value: -8.757031535546025
- type: nauc_mrr_at_5_std
value: -18.210766964081294
- type: nauc_ndcg_at_1000_diff1
value: 23.939400161569115
- type: nauc_ndcg_at_1000_max
value: -7.866999120512983
- type: nauc_ndcg_at_1000_std
value: -17.981457019643617
- type: nauc_ndcg_at_100_diff1
value: 23.920033349619317
- type: nauc_ndcg_at_100_max
value: -7.889849409678031
- type: nauc_ndcg_at_100_std
value: -18.054931990360537
- type: nauc_ndcg_at_10_diff1
value: 22.543020461303534
- type: nauc_ndcg_at_10_max
value: -7.072111788010867
- type: nauc_ndcg_at_10_std
value: -18.26397604573537
- type: nauc_ndcg_at_1_diff1
value: 29.325201664812788
- type: nauc_ndcg_at_1_max
value: -11.742800494823971
- type: nauc_ndcg_at_1_std
value: -18.610215769702528
- type: nauc_ndcg_at_20_diff1
value: 23.551587021207972
- type: nauc_ndcg_at_20_max
value: -7.298056222649139
- type: nauc_ndcg_at_20_std
value: -18.056004880930608
- type: nauc_ndcg_at_3_diff1
value: 22.669089506345273
- type: nauc_ndcg_at_3_max
value: -7.278024373570137
- type: nauc_ndcg_at_3_std
value: -17.816657759914193
- type: nauc_ndcg_at_5_diff1
value: 21.72619728226575
- type: nauc_ndcg_at_5_max
value: -6.959741647471228
- type: nauc_ndcg_at_5_std
value: -18.35173705190235
- type: nauc_precision_at_1000_diff1
value: 5.0388241058076995
- type: nauc_precision_at_1000_max
value: 34.439879624882145
- type: nauc_precision_at_1000_std
value: 77.22610895194498
- type: nauc_precision_at_100_diff1
value: 1.340670767252794
- type: nauc_precision_at_100_max
value: 19.30870025961241
- type: nauc_precision_at_100_std
value: 35.37688289157788
- type: nauc_precision_at_10_diff1
value: 7.734227153124332
- type: nauc_precision_at_10_max
value: 4.202399088422237
- type: nauc_precision_at_10_std
value: -18.383890254046698
- type: nauc_precision_at_1_diff1
value: 29.325201664812788
- type: nauc_precision_at_1_max
value: -11.742800494823971
- type: nauc_precision_at_1_std
value: -18.610215769702528
- type: nauc_precision_at_20_diff1
value: 9.48070999361637
- type: nauc_precision_at_20_max
value: 19.056709637253025
- type: nauc_precision_at_20_std
value: -13.266821166159485
- type: nauc_precision_at_3_diff1
value: 17.245260303409747
- type: nauc_precision_at_3_max
value: -4.202455033452335
- type: nauc_precision_at_3_std
value: -17.514264039955332
- type: nauc_precision_at_5_diff1
value: 12.074628162049974
- type: nauc_precision_at_5_max
value: -1.9145501461107832
- type: nauc_precision_at_5_std
value: -19.162525528916344
- type: nauc_recall_at_1000_diff1
value: 5.038824105805915
- type: nauc_recall_at_1000_max
value: 34.43987962487738
- type: nauc_recall_at_1000_std
value: 77.22610895193765
- type: nauc_recall_at_100_diff1
value: 1.3406707672497025
- type: nauc_recall_at_100_max
value: 19.30870025960776
- type: nauc_recall_at_100_std
value: 35.37688289157515
- type: nauc_recall_at_10_diff1
value: 7.734227153124366
- type: nauc_recall_at_10_max
value: 4.202399088421976
- type: nauc_recall_at_10_std
value: -18.38389025404673
- type: nauc_recall_at_1_diff1
value: 29.325201664812788
- type: nauc_recall_at_1_max
value: -11.742800494823971
- type: nauc_recall_at_1_std
value: -18.610215769702528
- type: nauc_recall_at_20_diff1
value: 9.480709993616845
- type: nauc_recall_at_20_max
value: 19.05670963725301
- type: nauc_recall_at_20_std
value: -13.266821166158651
- type: nauc_recall_at_3_diff1
value: 17.24526030340978
- type: nauc_recall_at_3_max
value: -4.202455033452323
- type: nauc_recall_at_3_std
value: -17.51426403995538
- type: nauc_recall_at_5_diff1
value: 12.074628162049992
- type: nauc_recall_at_5_max
value: -1.914550146110865
- type: nauc_recall_at_5_std
value: -19.162525528916362
- type: ndcg_at_1
value: 41.607
- type: ndcg_at_10
value: 65.269
- type: ndcg_at_100
value: 67.289
- type: ndcg_at_1000
value: 67.29899999999999
- type: ndcg_at_20
value: 66.76299999999999
- type: ndcg_at_3
value: 56.604
- type: ndcg_at_5
value: 61.07900000000001
- type: precision_at_1
value: 41.607
- type: precision_at_10
value: 9.118
- type: precision_at_100
value: 0.996
- type: precision_at_1000
value: 0.1
- type: precision_at_20
value: 4.8469999999999995
- type: precision_at_3
value: 22.451
- type: precision_at_5
value: 15.647
- type: recall_at_1
value: 41.607
- type: recall_at_10
value: 91.181
- type: recall_at_100
value: 99.57300000000001
- type: recall_at_1000
value: 99.644
- type: recall_at_20
value: 96.942
- type: recall_at_3
value: 67.354
- type: recall_at_5
value: 78.236
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringP2P
type: mteb/arxiv-clustering-p2p
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: main_score
value: 55.437138353189994
- type: v_measure
value: 55.437138353189994
- type: v_measure_std
value: 14.718556601335491
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringS2S
type: mteb/arxiv-clustering-s2s
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: main_score
value: 50.65858459544658
- type: v_measure
value: 50.65858459544658
- type: v_measure_std
value: 14.887033747525146
- task:
type: Reranking
dataset:
name: MTEB AskUbuntuDupQuestions
type: mteb/askubuntudupquestions-reranking
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: main_score
value: 67.32597152838535
- type: map
value: 67.32597152838535
- type: mrr
value: 78.98683111286988
- type: nAUC_map_diff1
value: 16.8624639710487
- type: nAUC_map_max
value: 24.91996491142433
- type: nAUC_map_std
value: 17.91865808793225
- type: nAUC_mrr_diff1
value: 25.03766425631947
- type: nAUC_mrr_max
value: 41.64561939958336
- type: nAUC_mrr_std
value: 23.179909345891968
- task:
type: STS
dataset:
name: MTEB BIOSSES
type: mteb/biosses-sts
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cosine_pearson
value: 85.790820496042
- type: cosine_spearman
value: 83.10731534330517
- type: euclidean_pearson
value: 84.61741304343133
- type: euclidean_spearman
value: 83.17297949010973
- type: main_score
value: 83.10731534330517
- type: manhattan_pearson
value: 85.2137696526676
- type: manhattan_spearman
value: 84.39168195786738
- type: pearson
value: 85.790820496042
- type: spearman
value: 83.10731534330517
- task:
type: Classification
dataset:
name: MTEB Banking77Classification
type: mteb/banking77
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 89.78896103896105
- type: f1
value: 89.76107366333488
- type: f1_weighted
value: 89.76107366333488
- type: main_score
value: 89.78896103896105
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringP2P
type: mteb/biorxiv-clustering-p2p
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: main_score
value: 50.68092296236376
- type: v_measure
value: 50.68092296236376
- type: v_measure_std
value: 0.7832640983085436
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringS2S
type: mteb/biorxiv-clustering-s2s
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: main_score
value: 46.86629236732983
- type: v_measure
value: 46.86629236732983
- type: v_measure_std
value: 0.8784322236350974
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackRetrieval
type: mteb/cqadupstack
config: default
split: test
revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4
metrics:
- type: main_score
value: 47.74883333333334
- type: map_at_1
value: 30.179249999999996
- type: map_at_10
value: 41.60824999999999
- type: map_at_100
value: 42.94008333333332
- type: map_at_1000
value: 43.04666666666667
- type: map_at_20
value: 42.36833333333334
- type: map_at_3
value: 38.23491666666666
- type: map_at_5
value: 40.10183333333333
- type: mrr_at_1
value: 36.47676085808166
- type: mrr_at_10
value: 46.300991916437155
- type: mrr_at_100
value: 47.12155753713262
- type: mrr_at_1000
value: 47.168033610799945
- type: mrr_at_20
value: 46.80405724560391
- type: mrr_at_3
value: 43.77000352801797
- type: mrr_at_5
value: 45.22295361704542
- type: nauc_map_at_1000_diff1
value: 46.953671666941524
- type: nauc_map_at_1000_max
value: 32.260396316089675
- type: nauc_map_at_1000_std
value: 0.6657766120094878
- type: nauc_map_at_100_diff1
value: 46.94717463394555
- type: nauc_map_at_100_max
value: 32.25088350678177
- type: nauc_map_at_100_std
value: 0.6257017014549283
- type: nauc_map_at_10_diff1
value: 46.974678429336464
- type: nauc_map_at_10_max
value: 31.862230807295504
- type: nauc_map_at_10_std
value: -0.14758828549579284
- type: nauc_map_at_1_diff1
value: 52.48913346466124
- type: nauc_map_at_1_max
value: 29.874374024967725
- type: nauc_map_at_1_std
value: -2.433547569836134
- type: nauc_map_at_20_diff1
value: 46.96088684217651
- type: nauc_map_at_20_max
value: 32.08954208613205
- type: nauc_map_at_20_std
value: 0.25946321113436527
- type: nauc_map_at_3_diff1
value: 47.703230121518345
- type: nauc_map_at_3_max
value: 30.977880095983107
- type: nauc_map_at_3_std
value: -1.342777563991804
- type: nauc_map_at_5_diff1
value: 47.1615010199957
- type: nauc_map_at_5_max
value: 31.420885812683284
- type: nauc_map_at_5_std
value: -0.8789297099444306
- type: nauc_mrr_at_1000_diff1
value: 46.69178645962615
- type: nauc_mrr_at_1000_max
value: 34.392807413340655
- type: nauc_mrr_at_1000_std
value: 1.6155464863667934
- type: nauc_mrr_at_100_diff1
value: 46.67417236349189
- type: nauc_mrr_at_100_max
value: 34.384607045512624
- type: nauc_mrr_at_100_std
value: 1.6259917384109652
- type: nauc_mrr_at_10_diff1
value: 46.60497560446239
- type: nauc_mrr_at_10_max
value: 34.32918897817958
- type: nauc_mrr_at_10_std
value: 1.39387793769014
- type: nauc_mrr_at_1_diff1
value: 51.61608573254137
- type: nauc_mrr_at_1_max
value: 35.18105023234596
- type: nauc_mrr_at_1_std
value: 0.17943702145478177
- type: nauc_mrr_at_20_diff1
value: 46.635943069860254
- type: nauc_mrr_at_20_max
value: 34.37050973118794
- type: nauc_mrr_at_20_std
value: 1.5346464678860607
- type: nauc_mrr_at_3_diff1
value: 47.154389369038334
- type: nauc_mrr_at_3_max
value: 34.41036411855465
- type: nauc_mrr_at_3_std
value: 0.924551812357872
- type: nauc_mrr_at_5_diff1
value: 46.6690101691763
- type: nauc_mrr_at_5_max
value: 34.29740388138466
- type: nauc_mrr_at_5_std
value: 1.0567184149139792
- type: nauc_ndcg_at_1000_diff1
value: 45.375448289173264
- type: nauc_ndcg_at_1000_max
value: 33.47957083714482
- type: nauc_ndcg_at_1000_std
value: 3.192251100225568
- type: nauc_ndcg_at_100_diff1
value: 44.93601014699499
- type: nauc_ndcg_at_100_max
value: 33.21249888295249
- type: nauc_ndcg_at_100_std
value: 3.609842852934217
- type: nauc_ndcg_at_10_diff1
value: 44.87893284011915
- type: nauc_ndcg_at_10_max
value: 32.384885249478515
- type: nauc_ndcg_at_10_std
value: 1.454493065035396
- type: nauc_ndcg_at_1_diff1
value: 51.61608573254137
- type: nauc_ndcg_at_1_max
value: 35.18105023234596
- type: nauc_ndcg_at_1_std
value: 0.17943702145478177
- type: nauc_ndcg_at_20_diff1
value: 44.867752179050605
- type: nauc_ndcg_at_20_max
value: 32.689535921840196
- type: nauc_ndcg_at_20_std
value: 2.337765158573901
- type: nauc_ndcg_at_3_diff1
value: 45.87485821381341
- type: nauc_ndcg_at_3_max
value: 32.33282450558947
- type: nauc_ndcg_at_3_std
value: 0.0681643829273283
- type: nauc_ndcg_at_5_diff1
value: 45.202902131892394
- type: nauc_ndcg_at_5_max
value: 32.1026971523917
- type: nauc_ndcg_at_5_std
value: 0.3565572833774486
- type: nauc_precision_at_1000_diff1
value: -8.935267931198956
- type: nauc_precision_at_1000_max
value: 6.464981960169269
- type: nauc_precision_at_1000_std
value: 10.662786182234633
- type: nauc_precision_at_100_diff1
value: -1.64091517847155
- type: nauc_precision_at_100_max
value: 15.175617871025024
- type: nauc_precision_at_100_std
value: 16.924256989248075
- type: nauc_precision_at_10_diff1
value: 15.676651966277047
- type: nauc_precision_at_10_max
value: 26.243734188847117
- type: nauc_precision_at_10_std
value: 10.601741034956333
- type: nauc_precision_at_1_diff1
value: 51.61608573254137
- type: nauc_precision_at_1_max
value: 35.18105023234596
- type: nauc_precision_at_1_std
value: 0.17943702145478177
- type: nauc_precision_at_20_diff1
value: 9.447267260198654
- type: nauc_precision_at_20_max
value: 23.024130858142723
- type: nauc_precision_at_20_std
value: 13.739145648899603
- type: nauc_precision_at_3_diff1
value: 30.11583572134629
- type: nauc_precision_at_3_max
value: 31.37321080069495
- type: nauc_precision_at_3_std
value: 4.705512374126024
- type: nauc_precision_at_5_diff1
value: 23.192015335996093
- type: nauc_precision_at_5_max
value: 29.415746835998764
- type: nauc_precision_at_5_std
value: 6.843498772798558
- type: nauc_recall_at_1000_diff1
value: 25.36573313426033
- type: nauc_recall_at_1000_max
value: 43.06672256524168
- type: nauc_recall_at_1000_std
value: 47.93664853815292
- type: nauc_recall_at_100_diff1
value: 31.222880916617406
- type: nauc_recall_at_100_max
value: 31.761159904172658
- type: nauc_recall_at_100_std
value: 23.034218976635877
- type: nauc_recall_at_10_diff1
value: 36.23439028915225
- type: nauc_recall_at_10_max
value: 28.473458977606438
- type: nauc_recall_at_10_std
value: 3.7797969934159
- type: nauc_recall_at_1_diff1
value: 52.48913346466124
- type: nauc_recall_at_1_max
value: 29.874374024967725
- type: nauc_recall_at_1_std
value: -2.433547569836134
- type: nauc_recall_at_20_diff1
value: 34.678676952584766
- type: nauc_recall_at_20_max
value: 29.04638392522168
- type: nauc_recall_at_20_std
value: 8.148894982082549
- type: nauc_recall_at_3_diff1
value: 41.31029996231311
- type: nauc_recall_at_3_max
value: 28.44199443414157
- type: nauc_recall_at_3_std
value: -0.747324057600377
- type: nauc_recall_at_5_diff1
value: 38.535873899920674
- type: nauc_recall_at_5_max
value: 27.942667805948375
- type: nauc_recall_at_5_std
value: 0.30652206930973686
- type: ndcg_at_1
value: 36.47675
- type: ndcg_at_10
value: 47.74883333333334
- type: ndcg_at_100
value: 52.902416666666674
- type: ndcg_at_1000
value: 54.69116666666667
- type: ndcg_at_20
value: 49.89758333333333
- type: ndcg_at_3
value: 42.462250000000004
- type: ndcg_at_5
value: 44.91841666666667
- type: precision_at_1
value: 36.47675
- type: precision_at_10
value: 8.582416666666665
- type: precision_at_100
value: 1.31475
- type: precision_at_1000
value: 0.16458333333333333
- type: precision_at_20
value: 5.021833333333333
- type: precision_at_3
value: 20.004499999999997
- type: precision_at_5
value: 14.178666666666665
- type: recall_at_1
value: 30.179249999999996
- type: recall_at_10
value: 60.950166666666675
- type: recall_at_100
value: 83.19025
- type: recall_at_1000
value: 95.27774999999998
- type: recall_at_20
value: 68.80175
- type: recall_at_3
value: 46.01841666666666
- type: recall_at_5
value: 52.482416666666666
- task:
type: Retrieval
dataset:
name: MTEB ClimateFEVER
type: mteb/climate-fever
config: default
split: test
revision: 47f2ac6acb640fc46020b02a5b59fdda04d39380
metrics:
- type: main_score
value: 46.113
- type: map_at_1
value: 20.122999999999998
- type: map_at_10
value: 35.474
- type: map_at_100
value: 37.592
- type: map_at_1000
value: 37.773
- type: map_at_20
value: 36.637
- type: map_at_3
value: 29.731
- type: map_at_5
value: 32.964
- type: mrr_at_1
value: 46.71009771986971
- type: mrr_at_10
value: 58.855669303552105
- type: mrr_at_100
value: 59.389249674038425
- type: mrr_at_1000
value: 59.408448104362364
- type: mrr_at_20
value: 59.23881203149016
- type: mrr_at_3
value: 56.18892508143328
- type: mrr_at_5
value: 57.85342019543985
- type: nauc_map_at_1000_diff1
value: 27.047031037721958
- type: nauc_map_at_1000_max
value: 43.25240279148033
- type: nauc_map_at_1000_std
value: 20.795849418696037
- type: nauc_map_at_100_diff1
value: 27.044739015116452
- type: nauc_map_at_100_max
value: 43.24042159787812
- type: nauc_map_at_100_std
value: 20.799952124137683
- type: nauc_map_at_10_diff1
value: 27.372696854670338
- type: nauc_map_at_10_max
value: 43.054456574721684
- type: nauc_map_at_10_std
value: 19.537162110136645
- type: nauc_map_at_1_diff1
value: 43.65424623953092
- type: nauc_map_at_1_max
value: 45.17986509998762
- type: nauc_map_at_1_std
value: 8.497107052335414
- type: nauc_map_at_20_diff1
value: 27.224535846566074
- type: nauc_map_at_20_max
value: 43.12222854561229
- type: nauc_map_at_20_std
value: 20.29982972202669
- type: nauc_map_at_3_diff1
value: 30.87847002319001
- type: nauc_map_at_3_max
value: 42.890027891707575
- type: nauc_map_at_3_std
value: 13.857451947580929
- type: nauc_map_at_5_diff1
value: 27.966867093591542
- type: nauc_map_at_5_max
value: 42.35826637592201
- type: nauc_map_at_5_std
value: 16.993102524058624
- type: nauc_mrr_at_1000_diff1
value: 30.191544077608164
- type: nauc_mrr_at_1000_max
value: 44.959438920351644
- type: nauc_mrr_at_1000_std
value: 24.065801376465114
- type: nauc_mrr_at_100_diff1
value: 30.170368115494
- type: nauc_mrr_at_100_max
value: 44.955868115761156
- type: nauc_mrr_at_100_std
value: 24.093510767847707
- type: nauc_mrr_at_10_diff1
value: 30.128430637520175
- type: nauc_mrr_at_10_max
value: 44.97689261350708
- type: nauc_mrr_at_10_std
value: 24.037049561818897
- type: nauc_mrr_at_1_diff1
value: 35.323351939108214
- type: nauc_mrr_at_1_max
value: 43.85026244855636
- type: nauc_mrr_at_1_std
value: 17.040662141218974
- type: nauc_mrr_at_20_diff1
value: 30.192006556160443
- type: nauc_mrr_at_20_max
value: 45.02814530774032
- type: nauc_mrr_at_20_std
value: 24.20885865448696
- type: nauc_mrr_at_3_diff1
value: 29.88250163424518
- type: nauc_mrr_at_3_max
value: 44.25768944883186
- type: nauc_mrr_at_3_std
value: 22.804183393364198
- type: nauc_mrr_at_5_diff1
value: 30.269824490420767
- type: nauc_mrr_at_5_max
value: 44.97443265796657
- type: nauc_mrr_at_5_std
value: 23.894159916141177
- type: nauc_ndcg_at_1000_diff1
value: 24.533764005407356
- type: nauc_ndcg_at_1000_max
value: 44.50902713386608
- type: nauc_ndcg_at_1000_std
value: 27.589506980238404
- type: nauc_ndcg_at_100_diff1
value: 24.209785073940353
- type: nauc_ndcg_at_100_max
value: 44.18257063893669
- type: nauc_ndcg_at_100_std
value: 27.963150866401943
- type: nauc_ndcg_at_10_diff1
value: 25.168069201989486
- type: nauc_ndcg_at_10_max
value: 43.84940910683214
- type: nauc_ndcg_at_10_std
value: 24.810707270956435
- type: nauc_ndcg_at_1_diff1
value: 35.323351939108214
- type: nauc_ndcg_at_1_max
value: 43.85026244855636
- type: nauc_ndcg_at_1_std
value: 17.040662141218974
- type: nauc_ndcg_at_20_diff1
value: 24.829924800466834
- type: nauc_ndcg_at_20_max
value: 43.738574327059716
- type: nauc_ndcg_at_20_std
value: 26.252370278684072
- type: nauc_ndcg_at_3_diff1
value: 27.321943393906274
- type: nauc_ndcg_at_3_max
value: 42.16584786993447
- type: nauc_ndcg_at_3_std
value: 18.24775079455969
- type: nauc_ndcg_at_5_diff1
value: 26.043785418347998
- type: nauc_ndcg_at_5_max
value: 42.874593895388344
- type: nauc_ndcg_at_5_std
value: 21.294004555506117
- type: nauc_precision_at_1000_diff1
value: -22.073027615308582
- type: nauc_precision_at_1000_max
value: -6.549723766317357
- type: nauc_precision_at_1000_std
value: 18.301749191241306
- type: nauc_precision_at_100_diff1
value: -15.654286887593619
- type: nauc_precision_at_100_max
value: 6.401516251421999
- type: nauc_precision_at_100_std
value: 29.170680324929805
- type: nauc_precision_at_10_diff1
value: -4.362381972892247
- type: nauc_precision_at_10_max
value: 22.10943515872447
- type: nauc_precision_at_10_std
value: 31.869699459530022
- type: nauc_precision_at_1_diff1
value: 35.323351939108214
- type: nauc_precision_at_1_max
value: 43.85026244855636
- type: nauc_precision_at_1_std
value: 17.040662141218974
- type: nauc_precision_at_20_diff1
value: -7.50749661117875
- type: nauc_precision_at_20_max
value: 16.80584016023257
- type: nauc_precision_at_20_std
value: 31.976755897112437
- type: nauc_precision_at_3_diff1
value: 7.402667538773083
- type: nauc_precision_at_3_max
value: 31.2088401330676
- type: nauc_precision_at_3_std
value: 24.287905698405662
- type: nauc_precision_at_5_diff1
value: 0.7479172565343901
- type: nauc_precision_at_5_max
value: 26.28427734237825
- type: nauc_precision_at_5_std
value: 28.246947120310317
- type: nauc_recall_at_1000_diff1
value: 2.4778431086370496
- type: nauc_recall_at_1000_max
value: 40.2231995797509
- type: nauc_recall_at_1000_std
value: 52.62124052183862
- type: nauc_recall_at_100_diff1
value: 8.960962419741463
- type: nauc_recall_at_100_max
value: 35.81132850291491
- type: nauc_recall_at_100_std
value: 40.020903251786166
- type: nauc_recall_at_10_diff1
value: 15.603400751376636
- type: nauc_recall_at_10_max
value: 37.570127529136485
- type: nauc_recall_at_10_std
value: 28.07128410238545
- type: nauc_recall_at_1_diff1
value: 43.65424623953092
- type: nauc_recall_at_1_max
value: 45.17986509998762
- type: nauc_recall_at_1_std
value: 8.497107052335414
- type: nauc_recall_at_20_diff1
value: 13.844820282832346
- type: nauc_recall_at_20_max
value: 36.0106148516309
- type: nauc_recall_at_20_std
value: 31.453103910565254
- type: nauc_recall_at_3_diff1
value: 24.359328154117748
- type: nauc_recall_at_3_max
value: 39.93774251377568
- type: nauc_recall_at_3_std
value: 16.214921517509648
- type: nauc_recall_at_5_diff1
value: 18.75788451360292
- type: nauc_recall_at_5_max
value: 38.177646107055516
- type: nauc_recall_at_5_std
value: 22.17196825834675
- type: ndcg_at_1
value: 46.71
- type: ndcg_at_10
value: 46.113
- type: ndcg_at_100
value: 53.035
- type: ndcg_at_1000
value: 55.724
- type: ndcg_at_20
value: 48.929
- type: ndcg_at_3
value: 39.501999999999995
- type: ndcg_at_5
value: 41.792
- type: precision_at_1
value: 46.71
- type: precision_at_10
value: 14.274000000000001
- type: precision_at_100
value: 2.1870000000000003
- type: precision_at_1000
value: 0.269
- type: precision_at_20
value: 8.375
- type: precision_at_3
value: 29.881
- type: precision_at_5
value: 22.697
- type: recall_at_1
value: 20.122999999999998
- type: recall_at_10
value: 52.22
- type: recall_at_100
value: 75.388
- type: recall_at_1000
value: 89.938
- type: recall_at_20
value: 60.077000000000005
- type: recall_at_3
value: 35.150999999999996
- type: recall_at_5
value: 42.748000000000005
- task:
type: Retrieval
dataset:
name: MTEB DBPedia
type: mteb/dbpedia
config: default
split: test
revision: c0f706b76e590d620bd6618b3ca8efdd34e2d659
metrics:
- type: main_score
value: 52.276999999999994
- type: map_at_1
value: 9.949
- type: map_at_10
value: 24.891
- type: map_at_100
value: 37.111
- type: map_at_1000
value: 39.266
- type: map_at_20
value: 29.685
- type: map_at_3
value: 16.586000000000002
- type: map_at_5
value: 19.982
- type: mrr_at_1
value: 76.25
- type: mrr_at_10
value: 82.4518849206349
- type: mrr_at_100
value: 82.70302194564499
- type: mrr_at_1000
value: 82.70909729942254
- type: mrr_at_20
value: 82.60492765962964
- type: mrr_at_3
value: 81.33333333333331
- type: mrr_at_5
value: 82.14583333333331
- type: nauc_map_at_1000_diff1
value: 21.427201262456556
- type: nauc_map_at_1000_max
value: 35.357361590816076
- type: nauc_map_at_1000_std
value: 24.785419223353717
- type: nauc_map_at_100_diff1
value: 22.82358692021537
- type: nauc_map_at_100_max
value: 35.07399692072945
- type: nauc_map_at_100_std
value: 22.679878828987025
- type: nauc_map_at_10_diff1
value: 26.491769223479643
- type: nauc_map_at_10_max
value: 20.78079385443902
- type: nauc_map_at_10_std
value: -4.910406292079661
- type: nauc_map_at_1_diff1
value: 35.20851030208876
- type: nauc_map_at_1_max
value: 5.783003346365858
- type: nauc_map_at_1_std
value: -21.11679133835354
- type: nauc_map_at_20_diff1
value: 24.80097499300491
- type: nauc_map_at_20_max
value: 26.807021360774975
- type: nauc_map_at_20_std
value: 4.793103995429955
- type: nauc_map_at_3_diff1
value: 29.238193458890173
- type: nauc_map_at_3_max
value: 10.300839972189456
- type: nauc_map_at_3_std
value: -17.889666731981592
- type: nauc_map_at_5_diff1
value: 28.773624870573926
- type: nauc_map_at_5_max
value: 14.951435645422887
- type: nauc_map_at_5_std
value: -13.319697827173565
- type: nauc_mrr_at_1000_diff1
value: 55.232544856708785
- type: nauc_mrr_at_1000_max
value: 64.73225637682637
- type: nauc_mrr_at_1000_std
value: 37.57480399594188
- type: nauc_mrr_at_100_diff1
value: 55.219251601773735
- type: nauc_mrr_at_100_max
value: 64.73305063663611
- type: nauc_mrr_at_100_std
value: 37.56458562909293
- type: nauc_mrr_at_10_diff1
value: 55.123463838253464
- type: nauc_mrr_at_10_max
value: 64.91914041040233
- type: nauc_mrr_at_10_std
value: 37.76482503851598
- type: nauc_mrr_at_1_diff1
value: 56.45461238513347
- type: nauc_mrr_at_1_max
value: 63.11782510293676
- type: nauc_mrr_at_1_std
value: 33.592561284868985
- type: nauc_mrr_at_20_diff1
value: 55.15401961460458
- type: nauc_mrr_at_20_max
value: 64.77145835613156
- type: nauc_mrr_at_20_std
value: 37.471561418305804
- type: nauc_mrr_at_3_diff1
value: 54.64387438697658
- type: nauc_mrr_at_3_max
value: 64.27618995019164
- type: nauc_mrr_at_3_std
value: 39.391637295269014
- type: nauc_mrr_at_5_diff1
value: 55.08702591239485
- type: nauc_mrr_at_5_max
value: 64.6071475650635
- type: nauc_mrr_at_5_std
value: 37.97185134269896
- type: nauc_ndcg_at_1000_diff1
value: 31.696698876400387
- type: nauc_ndcg_at_1000_max
value: 52.12183760001191
- type: nauc_ndcg_at_1000_std
value: 40.197596211778716
- type: nauc_ndcg_at_100_diff1
value: 33.253120193433666
- type: nauc_ndcg_at_100_max
value: 49.47167758554746
- type: nauc_ndcg_at_100_std
value: 32.643833139756204
- type: nauc_ndcg_at_10_diff1
value: 27.065541392580013
- type: nauc_ndcg_at_10_max
value: 45.83504281289289
- type: nauc_ndcg_at_10_std
value: 27.11739500732328
- type: nauc_ndcg_at_1_diff1
value: 49.42808250022517
- type: nauc_ndcg_at_1_max
value: 53.502615048520354
- type: nauc_ndcg_at_1_std
value: 27.17555908836708
- type: nauc_ndcg_at_20_diff1
value: 29.374791382330308
- type: nauc_ndcg_at_20_max
value: 43.91246842479055
- type: nauc_ndcg_at_20_std
value: 23.419410620550316
- type: nauc_ndcg_at_3_diff1
value: 26.71550354496204
- type: nauc_ndcg_at_3_max
value: 43.9641457892003
- type: nauc_ndcg_at_3_std
value: 27.320024167947686
- type: nauc_ndcg_at_5_diff1
value: 27.020654974589487
- type: nauc_ndcg_at_5_max
value: 46.130417266030584
- type: nauc_ndcg_at_5_std
value: 28.392009019010068
- type: nauc_precision_at_1000_diff1
value: -21.47455482181002
- type: nauc_precision_at_1000_max
value: -9.721907229236024
- type: nauc_precision_at_1000_std
value: -1.061132062651487
- type: nauc_precision_at_100_diff1
value: -12.35759246101943
- type: nauc_precision_at_100_max
value: 15.509512444892168
- type: nauc_precision_at_100_std
value: 36.21183578592014
- type: nauc_precision_at_10_diff1
value: -6.136998947343125
- type: nauc_precision_at_10_max
value: 32.30037906748288
- type: nauc_precision_at_10_std
value: 41.4500302476981
- type: nauc_precision_at_1_diff1
value: 56.45461238513347
- type: nauc_precision_at_1_max
value: 63.11782510293676
- type: nauc_precision_at_1_std
value: 33.592561284868985
- type: nauc_precision_at_20_diff1
value: -7.335890123683174
- type: nauc_precision_at_20_max
value: 28.31417075291312
- type: nauc_precision_at_20_std
value: 41.405935715061815
- type: nauc_precision_at_3_diff1
value: 7.117255890225942
- type: nauc_precision_at_3_max
value: 39.19894132683829
- type: nauc_precision_at_3_std
value: 38.48255841994843
- type: nauc_precision_at_5_diff1
value: 1.861523090114206
- type: nauc_precision_at_5_max
value: 38.11649223007208
- type: nauc_precision_at_5_std
value: 40.52993530374645
- type: nauc_recall_at_1000_diff1
value: 26.497648584314636
- type: nauc_recall_at_1000_max
value: 44.48069746734414
- type: nauc_recall_at_1000_std
value: 53.16438130228715
- type: nauc_recall_at_100_diff1
value: 26.353456899511446
- type: nauc_recall_at_100_max
value: 37.57379787884197
- type: nauc_recall_at_100_std
value: 29.197468295989548
- type: nauc_recall_at_10_diff1
value: 22.80445738351114
- type: nauc_recall_at_10_max
value: 15.895630778449046
- type: nauc_recall_at_10_std
value: -8.746224797644501
- type: nauc_recall_at_1_diff1
value: 35.20851030208876
- type: nauc_recall_at_1_max
value: 5.783003346365858
- type: nauc_recall_at_1_std
value: -21.11679133835354
- type: nauc_recall_at_20_diff1
value: 22.34028867678706
- type: nauc_recall_at_20_max
value: 21.42373427646772
- type: nauc_recall_at_20_std
value: 0.4533036151015875
- type: nauc_recall_at_3_diff1
value: 24.96853445599229
- type: nauc_recall_at_3_max
value: 6.245185375804208
- type: nauc_recall_at_3_std
value: -20.200240127099622
- type: nauc_recall_at_5_diff1
value: 24.749259476710623
- type: nauc_recall_at_5_max
value: 11.024592845995942
- type: nauc_recall_at_5_std
value: -16.15683085641543
- type: ndcg_at_1
value: 64.125
- type: ndcg_at_10
value: 52.276999999999994
- type: ndcg_at_100
value: 57.440000000000005
- type: ndcg_at_1000
value: 64.082
- type: ndcg_at_20
value: 51.383
- type: ndcg_at_3
value: 55.769000000000005
- type: ndcg_at_5
value: 53.978
- type: precision_at_1
value: 76.25
- type: precision_at_10
value: 43.05
- type: precision_at_100
value: 14.09
- type: precision_at_1000
value: 2.662
- type: precision_at_20
value: 33.112
- type: precision_at_3
value: 59.833000000000006
- type: precision_at_5
value: 53.05
- type: recall_at_1
value: 9.949
- type: recall_at_10
value: 30.424
- type: recall_at_100
value: 64.062
- type: recall_at_1000
value: 85.916
- type: recall_at_20
value: 39.895
- type: recall_at_3
value: 17.876
- type: recall_at_5
value: 22.536
- task:
type: Classification
dataset:
name: MTEB EmotionClassification
type: mteb/emotion
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 84.29499999999999
- type: f1
value: 79.76188258172078
- type: f1_weighted
value: 84.96026012933847
- type: main_score
value: 84.29499999999999
- task:
type: Retrieval
dataset:
name: MTEB FEVER
type: mteb/fever
config: default
split: test
revision: bea83ef9e8fb933d90a2f1d5515737465d613e12
metrics:
- type: main_score
value: 94.83200000000001
- type: map_at_1
value: 87.339
- type: map_at_10
value: 92.92099999999999
- type: map_at_100
value: 93.108
- type: map_at_1000
value: 93.116
- type: map_at_20
value: 93.041
- type: map_at_3
value: 92.219
- type: map_at_5
value: 92.664
- type: mrr_at_1
value: 93.99939993999399
- type: mrr_at_10
value: 96.55188137861403
- type: mrr_at_100
value: 96.5652366009286
- type: mrr_at_1000
value: 96.5652625550811
- type: mrr_at_20
value: 96.5601781754844
- type: mrr_at_3
value: 96.45714571457142
- type: mrr_at_5
value: 96.544904490449
- type: nauc_map_at_1000_diff1
value: 51.81676454961933
- type: nauc_map_at_1000_max
value: 24.904822914926118
- type: nauc_map_at_1000_std
value: -3.8110347821630404
- type: nauc_map_at_100_diff1
value: 51.77514975011158
- type: nauc_map_at_100_max
value: 24.912497341800094
- type: nauc_map_at_100_std
value: -3.76229517662447
- type: nauc_map_at_10_diff1
value: 51.29608296382479
- type: nauc_map_at_10_max
value: 24.78704970246707
- type: nauc_map_at_10_std
value: -3.723130815783328
- type: nauc_map_at_1_diff1
value: 59.90813138005125
- type: nauc_map_at_1_max
value: 24.58479295693794
- type: nauc_map_at_1_std
value: -8.056152492777027
- type: nauc_map_at_20_diff1
value: 51.428639331678326
- type: nauc_map_at_20_max
value: 24.849214517705086
- type: nauc_map_at_20_std
value: -3.685550123874596
- type: nauc_map_at_3_diff1
value: 50.94399923719279
- type: nauc_map_at_3_max
value: 24.359700180006207
- type: nauc_map_at_3_std
value: -5.407767408816422
- type: nauc_map_at_5_diff1
value: 50.767302682959546
- type: nauc_map_at_5_max
value: 24.491113461892215
- type: nauc_map_at_5_std
value: -4.058336127339082
- type: nauc_mrr_at_1000_diff1
value: 79.86042313551833
- type: nauc_mrr_at_1000_max
value: 23.20960445633933
- type: nauc_mrr_at_1000_std
value: -23.54334295120471
- type: nauc_mrr_at_100_diff1
value: 79.85991247027636
- type: nauc_mrr_at_100_max
value: 23.210085926780106
- type: nauc_mrr_at_100_std
value: -23.542508200789197
- type: nauc_mrr_at_10_diff1
value: 79.71095155563415
- type: nauc_mrr_at_10_max
value: 23.24128650883908
- type: nauc_mrr_at_10_std
value: -23.408502781834102
- type: nauc_mrr_at_1_diff1
value: 82.6349900233902
- type: nauc_mrr_at_1_max
value: 21.994548214014227
- type: nauc_mrr_at_1_std
value: -22.549769792179262
- type: nauc_mrr_at_20_diff1
value: 79.76465012873038
- type: nauc_mrr_at_20_max
value: 23.17575026523213
- type: nauc_mrr_at_20_std
value: -23.492660166315048
- type: nauc_mrr_at_3_diff1
value: 79.91074933379953
- type: nauc_mrr_at_3_max
value: 24.14246499097892
- type: nauc_mrr_at_3_std
value: -25.22601708389664
- type: nauc_mrr_at_5_diff1
value: 79.62092651565847
- type: nauc_mrr_at_5_max
value: 23.315937737034425
- type: nauc_mrr_at_5_std
value: -23.317659360058403
- type: nauc_ndcg_at_1000_diff1
value: 54.404537986779225
- type: nauc_ndcg_at_1000_max
value: 25.38408304128995
- type: nauc_ndcg_at_1000_std
value: -4.916709117696968
- type: nauc_ndcg_at_100_diff1
value: 53.2448598868241
- type: nauc_ndcg_at_100_max
value: 25.75325255295546
- type: nauc_ndcg_at_100_std
value: -3.680507005630751
- type: nauc_ndcg_at_10_diff1
value: 50.81057355170232
- type: nauc_ndcg_at_10_max
value: 25.006448273343807
- type: nauc_ndcg_at_10_std
value: -2.8979899112515577
- type: nauc_ndcg_at_1_diff1
value: 82.6349900233902
- type: nauc_ndcg_at_1_max
value: 21.994548214014227
- type: nauc_ndcg_at_1_std
value: -22.549769792179262
- type: nauc_ndcg_at_20_diff1
value: 51.205023097166304
- type: nauc_ndcg_at_20_max
value: 25.22133626556826
- type: nauc_ndcg_at_20_std
value: -2.9506328244150155
- type: nauc_ndcg_at_3_diff1
value: 51.79780256736321
- type: nauc_ndcg_at_3_max
value: 24.81137324438439
- type: nauc_ndcg_at_3_std
value: -6.881223858227807
- type: nauc_ndcg_at_5_diff1
value: 50.290038260564565
- type: nauc_ndcg_at_5_max
value: 24.57250792165796
- type: nauc_ndcg_at_5_std
value: -3.5124628344654596
- type: nauc_precision_at_1000_diff1
value: -20.215211396894333
- type: nauc_precision_at_1000_max
value: -14.165452298769171
- type: nauc_precision_at_1000_std
value: -2.0952871214470816
- type: nauc_precision_at_100_diff1
value: -22.340257474494607
- type: nauc_precision_at_100_max
value: -12.697885641360282
- type: nauc_precision_at_100_std
value: 1.0688624940286244
- type: nauc_precision_at_10_diff1
value: -24.78271817420798
- type: nauc_precision_at_10_max
value: -12.625257500222656
- type: nauc_precision_at_10_std
value: 3.223250450607087
- type: nauc_precision_at_1_diff1
value: 82.6349900233902
- type: nauc_precision_at_1_max
value: 21.994548214014227
- type: nauc_precision_at_1_std
value: -22.549769792179262
- type: nauc_precision_at_20_diff1
value: -24.375756227194177
- type: nauc_precision_at_20_max
value: -12.341015011563536
- type: nauc_precision_at_20_std
value: 2.7475274619387955
- type: nauc_precision_at_3_diff1
value: -24.8251306777365
- type: nauc_precision_at_3_max
value: -13.109579709589042
- type: nauc_precision_at_3_std
value: -1.2233442335420748
- type: nauc_precision_at_5_diff1
value: -26.955418583344894
- type: nauc_precision_at_5_max
value: -13.598630838071015
- type: nauc_precision_at_5_std
value: 2.545780631940738
- type: nauc_recall_at_1000_diff1
value: 0.2542680835344437
- type: nauc_recall_at_1000_max
value: 49.38194243035277
- type: nauc_recall_at_1000_std
value: 57.021502715846026
- type: nauc_recall_at_100_diff1
value: 5.062154815367015
- type: nauc_recall_at_100_max
value: 45.41178380188437
- type: nauc_recall_at_100_std
value: 50.78382225901813
- type: nauc_recall_at_10_diff1
value: 20.429153629007818
- type: nauc_recall_at_10_max
value: 27.516855026155508
- type: nauc_recall_at_10_std
value: 21.367491371755467
- type: nauc_recall_at_1_diff1
value: 59.90813138005125
- type: nauc_recall_at_1_max
value: 24.58479295693794
- type: nauc_recall_at_1_std
value: -8.056152492777027
- type: nauc_recall_at_20_diff1
value: 13.072430858896942
- type: nauc_recall_at_20_max
value: 29.5522659183247
- type: nauc_recall_at_20_std
value: 28.70569974090291
- type: nauc_recall_at_3_diff1
value: 30.419084482663617
- type: nauc_recall_at_3_max
value: 25.627389580252835
- type: nauc_recall_at_3_std
value: 2.5557690877637054
- type: nauc_recall_at_5_diff1
value: 22.92561435069869
- type: nauc_recall_at_5_max
value: 25.545265063475455
- type: nauc_recall_at_5_std
value: 14.736172663072786
- type: ndcg_at_1
value: 93.999
- type: ndcg_at_10
value: 94.83200000000001
- type: ndcg_at_100
value: 95.363
- type: ndcg_at_1000
value: 95.478
- type: ndcg_at_20
value: 95.077
- type: ndcg_at_3
value: 94.143
- type: ndcg_at_5
value: 94.525
- type: precision_at_1
value: 93.999
- type: precision_at_10
value: 11.029
- type: precision_at_100
value: 1.1560000000000001
- type: precision_at_1000
value: 0.11800000000000001
- type: precision_at_20
value: 5.62
- type: precision_at_3
value: 35.219
- type: precision_at_5
value: 21.584
- type: recall_at_1
value: 87.339
- type: recall_at_10
value: 97.026
- type: recall_at_100
value: 98.936
- type: recall_at_1000
value: 99.599
- type: recall_at_20
value: 97.744
- type: recall_at_3
value: 95.069
- type: recall_at_5
value: 96.177
- task:
type: Retrieval
dataset:
name: MTEB FiQA2018
type: mteb/fiqa
config: default
split: test
revision: 27a168819829fe9bcd655c2df245fb19452e8e06
metrics:
- type: main_score
value: 60.480000000000004
- type: map_at_1
value: 31.529
- type: map_at_10
value: 52.081
- type: map_at_100
value: 54.342
- type: map_at_1000
value: 54.449000000000005
- type: map_at_20
value: 53.479
- type: map_at_3
value: 45.471000000000004
- type: map_at_5
value: 49.164
- type: mrr_at_1
value: 60.03086419753087
- type: mrr_at_10
value: 67.73754409171075
- type: mrr_at_100
value: 68.332432152368
- type: mrr_at_1000
value: 68.34150941774908
- type: mrr_at_20
value: 68.14780993838725
- type: mrr_at_3
value: 65.6378600823045
- type: mrr_at_5
value: 66.88014403292176
- type: nauc_map_at_1000_diff1
value: 45.36598134579052
- type: nauc_map_at_1000_max
value: 31.891451119906943
- type: nauc_map_at_1000_std
value: -15.41454384137943
- type: nauc_map_at_100_diff1
value: 45.31268291874018
- type: nauc_map_at_100_max
value: 31.811055683002092
- type: nauc_map_at_100_std
value: -15.348503855591417
- type: nauc_map_at_10_diff1
value: 45.22606983565892
- type: nauc_map_at_10_max
value: 30.46108534749699
- type: nauc_map_at_10_std
value: -16.618086029682555
- type: nauc_map_at_1_diff1
value: 49.94952823753276
- type: nauc_map_at_1_max
value: 13.770377574254548
- type: nauc_map_at_1_std
value: -14.946357968858653
- type: nauc_map_at_20_diff1
value: 45.29274207897926
- type: nauc_map_at_20_max
value: 31.27332015148257
- type: nauc_map_at_20_std
value: -15.782946115613129
- type: nauc_map_at_3_diff1
value: 47.94248233566038
- type: nauc_map_at_3_max
value: 24.022838776825456
- type: nauc_map_at_3_std
value: -17.103518542262208
- type: nauc_map_at_5_diff1
value: 45.85345590031722
- type: nauc_map_at_5_max
value: 27.78341379004547
- type: nauc_map_at_5_std
value: -17.490850791756326
- type: nauc_mrr_at_1000_diff1
value: 58.225141047822824
- type: nauc_mrr_at_1000_max
value: 43.39606904140525
- type: nauc_mrr_at_1000_std
value: -14.64093518199122
- type: nauc_mrr_at_100_diff1
value: 58.22137274179545
- type: nauc_mrr_at_100_max
value: 43.39567568136935
- type: nauc_mrr_at_100_std
value: -14.62512313985582
- type: nauc_mrr_at_10_diff1
value: 58.03217329957151
- type: nauc_mrr_at_10_max
value: 43.633561683075186
- type: nauc_mrr_at_10_std
value: -14.563703576023808
- type: nauc_mrr_at_1_diff1
value: 61.48979902647692
- type: nauc_mrr_at_1_max
value: 43.1938079066948
- type: nauc_mrr_at_1_std
value: -15.808138277440465
- type: nauc_mrr_at_20_diff1
value: 58.13185370150794
- type: nauc_mrr_at_20_max
value: 43.35607721183147
- type: nauc_mrr_at_20_std
value: -14.635812702971263
- type: nauc_mrr_at_3_diff1
value: 58.698963168321264
- type: nauc_mrr_at_3_max
value: 43.633129249785405
- type: nauc_mrr_at_3_std
value: -15.733246346983854
- type: nauc_mrr_at_5_diff1
value: 57.94156745229547
- type: nauc_mrr_at_5_max
value: 43.14152462640525
- type: nauc_mrr_at_5_std
value: -15.318685307750895
- type: nauc_ndcg_at_1000_diff1
value: 47.871896043731496
- type: nauc_ndcg_at_1000_max
value: 37.159845167533426
- type: nauc_ndcg_at_1000_std
value: -13.067288160833485
- type: nauc_ndcg_at_100_diff1
value: 47.046171407204426
- type: nauc_ndcg_at_100_max
value: 36.422514360855835
- type: nauc_ndcg_at_100_std
value: -11.636859259571441
- type: nauc_ndcg_at_10_diff1
value: 46.232628149078096
- type: nauc_ndcg_at_10_max
value: 34.82402625088358
- type: nauc_ndcg_at_10_std
value: -14.768545542980114
- type: nauc_ndcg_at_1_diff1
value: 61.48979902647692
- type: nauc_ndcg_at_1_max
value: 43.1938079066948
- type: nauc_ndcg_at_1_std
value: -15.808138277440465
- type: nauc_ndcg_at_20_diff1
value: 46.51116172390955
- type: nauc_ndcg_at_20_max
value: 35.36362650568298
- type: nauc_ndcg_at_20_std
value: -12.849406209182826
- type: nauc_ndcg_at_3_diff1
value: 47.39832263785871
- type: nauc_ndcg_at_3_max
value: 35.67466264628456
- type: nauc_ndcg_at_3_std
value: -17.257717349296943
- type: nauc_ndcg_at_5_diff1
value: 45.91049493804232
- type: nauc_ndcg_at_5_max
value: 33.8405091138445
- type: nauc_ndcg_at_5_std
value: -17.477069902735895
- type: nauc_precision_at_1000_diff1
value: -12.037873000917767
- type: nauc_precision_at_1000_max
value: 26.043220150002295
- type: nauc_precision_at_1000_std
value: 6.84910668321572
- type: nauc_precision_at_100_diff1
value: -9.383403459051864
- type: nauc_precision_at_100_max
value: 29.68713170610003
- type: nauc_precision_at_100_std
value: 10.079531587056152
- type: nauc_precision_at_10_diff1
value: 3.3433323353925135
- type: nauc_precision_at_10_max
value: 38.31790111725993
- type: nauc_precision_at_10_std
value: 0.7888123304710856
- type: nauc_precision_at_1_diff1
value: 61.48979902647692
- type: nauc_precision_at_1_max
value: 43.1938079066948
- type: nauc_precision_at_1_std
value: -15.808138277440465
- type: nauc_precision_at_20_diff1
value: -2.083500986294448
- type: nauc_precision_at_20_max
value: 35.77143835726343
- type: nauc_precision_at_20_std
value: 5.318547021874003
- type: nauc_precision_at_3_diff1
value: 23.335617788912586
- type: nauc_precision_at_3_max
value: 39.81973275320871
- type: nauc_precision_at_3_std
value: -8.442769390555561
- type: nauc_precision_at_5_diff1
value: 11.521087842589482
- type: nauc_precision_at_5_max
value: 39.527792539828255
- type: nauc_precision_at_5_std
value: -5.412729503701626
- type: nauc_recall_at_1000_diff1
value: 10.6830893047453
- type: nauc_recall_at_1000_max
value: 8.834504311238423
- type: nauc_recall_at_1000_std
value: 24.670754304859692
- type: nauc_recall_at_100_diff1
value: 20.646020385527358
- type: nauc_recall_at_100_max
value: 20.121595011523294
- type: nauc_recall_at_100_std
value: 19.42307459311791
- type: nauc_recall_at_10_diff1
value: 33.01029313733417
- type: nauc_recall_at_10_max
value: 27.948634980368702
- type: nauc_recall_at_10_std
value: -10.239767371462975
- type: nauc_recall_at_1_diff1
value: 49.94952823753276
- type: nauc_recall_at_1_max
value: 13.770377574254548
- type: nauc_recall_at_1_std
value: -14.946357968858653
- type: nauc_recall_at_20_diff1
value: 30.040111045267963
- type: nauc_recall_at_20_max
value: 25.984919302418184
- type: nauc_recall_at_20_std
value: -1.4998001817460804
- type: nauc_recall_at_3_diff1
value: 42.24410559113653
- type: nauc_recall_at_3_max
value: 20.269503583626914
- type: nauc_recall_at_3_std
value: -17.09578532600584
- type: nauc_recall_at_5_diff1
value: 36.124149735848945
- type: nauc_recall_at_5_max
value: 22.708022306002622
- type: nauc_recall_at_5_std
value: -16.966976847236193
- type: ndcg_at_1
value: 60.031
- type: ndcg_at_10
value: 60.480000000000004
- type: ndcg_at_100
value: 66.94099999999999
- type: ndcg_at_1000
value: 68.303
- type: ndcg_at_20
value: 63.536
- type: ndcg_at_3
value: 55.903999999999996
- type: ndcg_at_5
value: 57.387
- type: precision_at_1
value: 60.031
- type: precision_at_10
value: 16.682
- type: precision_at_100
value: 2.336
- type: precision_at_1000
value: 0.259
- type: precision_at_20
value: 9.66
- type: precision_at_3
value: 37.191
- type: precision_at_5
value: 27.253
- type: recall_at_1
value: 31.529
- type: recall_at_10
value: 68.035
- type: recall_at_100
value: 90.925
- type: recall_at_1000
value: 98.688
- type: recall_at_20
value: 77.453
- type: recall_at_3
value: 50.221000000000004
- type: recall_at_5
value: 58.209999999999994
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA
type: mteb/hotpotqa
config: default
split: test
revision: ab518f4d6fcca38d87c25209f94beba119d02014
metrics:
- type: main_score
value: 76.67399999999999
- type: map_at_1
value: 43.822
- type: map_at_10
value: 68.82000000000001
- type: map_at_100
value: 69.659
- type: map_at_1000
value: 69.714
- type: map_at_20
value: 69.305
- type: map_at_3
value: 65.517
- type: map_at_5
value: 67.633
- type: mrr_at_1
value: 87.643484132343
- type: mrr_at_10
value: 91.28134679485098
- type: mrr_at_100
value: 91.37985230614755
- type: mrr_at_1000
value: 91.38202467630681
- type: mrr_at_20
value: 91.34718855278429
- type: mrr_at_3
value: 90.75849651136599
- type: mrr_at_5
value: 91.10961062345235
- type: nauc_map_at_1000_diff1
value: 3.7670405082837477
- type: nauc_map_at_1000_max
value: 14.410594409695182
- type: nauc_map_at_1000_std
value: 7.94738583292685
- type: nauc_map_at_100_diff1
value: 3.738796209193936
- type: nauc_map_at_100_max
value: 14.408029101534694
- type: nauc_map_at_100_std
value: 7.979641077687816
- type: nauc_map_at_10_diff1
value: 3.334917978089454
- type: nauc_map_at_10_max
value: 13.975255289147748
- type: nauc_map_at_10_std
value: 7.491959628012161
- type: nauc_map_at_1_diff1
value: 75.35066482050009
- type: nauc_map_at_1_max
value: 53.573503488571475
- type: nauc_map_at_1_std
value: -6.542030594426993
- type: nauc_map_at_20_diff1
value: 3.5197129341582083
- type: nauc_map_at_20_max
value: 14.159880698006816
- type: nauc_map_at_20_std
value: 7.856574384998483
- type: nauc_map_at_3_diff1
value: 3.0992333232864064
- type: nauc_map_at_3_max
value: 12.513959281222112
- type: nauc_map_at_3_std
value: 4.352912866014865
- type: nauc_map_at_5_diff1
value: 3.0351688998572537
- type: nauc_map_at_5_max
value: 13.21599457624529
- type: nauc_map_at_5_std
value: 6.246882983214777
- type: nauc_mrr_at_1000_diff1
value: 75.23953736361132
- type: nauc_mrr_at_1000_max
value: 56.64260717262164
- type: nauc_mrr_at_1000_std
value: -4.865932053762276
- type: nauc_mrr_at_100_diff1
value: 75.24091372816497
- type: nauc_mrr_at_100_max
value: 56.64831104504846
- type: nauc_mrr_at_100_std
value: -4.850966297943324
- type: nauc_mrr_at_10_diff1
value: 75.26540178053416
- type: nauc_mrr_at_10_max
value: 56.828755673428965
- type: nauc_mrr_at_10_std
value: -4.8401126970944635
- type: nauc_mrr_at_1_diff1
value: 75.35066482050009
- type: nauc_mrr_at_1_max
value: 53.573503488571475
- type: nauc_mrr_at_1_std
value: -6.542030594426993
- type: nauc_mrr_at_20_diff1
value: 75.24453050729845
- type: nauc_mrr_at_20_max
value: 56.69220588401435
- type: nauc_mrr_at_20_std
value: -4.843700730832108
- type: nauc_mrr_at_3_diff1
value: 74.98411648336175
- type: nauc_mrr_at_3_max
value: 56.766537573537114
- type: nauc_mrr_at_3_std
value: -4.909712671649337
- type: nauc_mrr_at_5_diff1
value: 75.20599020991028
- type: nauc_mrr_at_5_max
value: 56.64236207782237
- type: nauc_mrr_at_5_std
value: -5.208907367513977
- type: nauc_ndcg_at_1000_diff1
value: 11.48307079099774
- type: nauc_ndcg_at_1000_max
value: 20.893326881675176
- type: nauc_ndcg_at_1000_std
value: 10.43489838692119
- type: nauc_ndcg_at_100_diff1
value: 10.395588735754927
- type: nauc_ndcg_at_100_max
value: 20.529573302516912
- type: nauc_ndcg_at_100_std
value: 11.252973083654268
- type: nauc_ndcg_at_10_diff1
value: 8.596739352741972
- type: nauc_ndcg_at_10_max
value: 18.475863682540673
- type: nauc_ndcg_at_10_std
value: 9.175831033463352
- type: nauc_ndcg_at_1_diff1
value: 75.35066482050009
- type: nauc_ndcg_at_1_max
value: 53.573503488571475
- type: nauc_ndcg_at_1_std
value: -6.542030594426993
- type: nauc_ndcg_at_20_diff1
value: 8.998033972471749
- type: nauc_ndcg_at_20_max
value: 18.892085875404522
- type: nauc_ndcg_at_20_std
value: 10.3241608901084
- type: nauc_ndcg_at_3_diff1
value: 8.796384949533579
- type: nauc_ndcg_at_3_max
value: 16.515261419885274
- type: nauc_ndcg_at_3_std
value: 4.081902976576701
- type: nauc_ndcg_at_5_diff1
value: 8.277259464605025
- type: nauc_ndcg_at_5_max
value: 17.163053202909527
- type: nauc_ndcg_at_5_std
value: 6.652669449704474
- type: nauc_precision_at_1000_diff1
value: -3.490556596304827
- type: nauc_precision_at_1000_max
value: 31.0473259001597
- type: nauc_precision_at_1000_std
value: 52.36921397692622
- type: nauc_precision_at_100_diff1
value: -6.420747959222489
- type: nauc_precision_at_100_max
value: 20.555887056005936
- type: nauc_precision_at_100_std
value: 36.119132870798495
- type: nauc_precision_at_10_diff1
value: -6.461726057290426
- type: nauc_precision_at_10_max
value: 12.161081825341915
- type: nauc_precision_at_10_std
value: 17.961318451839993
- type: nauc_precision_at_1_diff1
value: 75.35066482050009
- type: nauc_precision_at_1_max
value: 53.573503488571475
- type: nauc_precision_at_1_std
value: -6.542030594426993
- type: nauc_precision_at_20_diff1
value: -7.361461296416161
- type: nauc_precision_at_20_max
value: 12.663621261696733
- type: nauc_precision_at_20_std
value: 23.312476851670286
- type: nauc_precision_at_3_diff1
value: -3.299056912774522
- type: nauc_precision_at_3_max
value: 9.85602375812038
- type: nauc_precision_at_3_std
value: 6.4962782003155475
- type: nauc_precision_at_5_diff1
value: -5.3155827772027795
- type: nauc_precision_at_5_max
value: 10.32907751171833
- type: nauc_precision_at_5_std
value: 11.384098087196932
- type: nauc_recall_at_1000_diff1
value: -3.4905565963043332
- type: nauc_recall_at_1000_max
value: 31.04732590016041
- type: nauc_recall_at_1000_std
value: 52.36921397692641
- type: nauc_recall_at_100_diff1
value: -6.420747959222586
- type: nauc_recall_at_100_max
value: 20.55588705600596
- type: nauc_recall_at_100_std
value: 36.11913287079825
- type: nauc_recall_at_10_diff1
value: -6.461726057290347
- type: nauc_recall_at_10_max
value: 12.161081825342022
- type: nauc_recall_at_10_std
value: 17.96131845184002
- type: nauc_recall_at_1_diff1
value: 75.35066482050009
- type: nauc_recall_at_1_max
value: 53.573503488571475
- type: nauc_recall_at_1_std
value: -6.542030594426993
- type: nauc_recall_at_20_diff1
value: -7.361461296416054
- type: nauc_recall_at_20_max
value: 12.66362126169679
- type: nauc_recall_at_20_std
value: 23.312476851670382
- type: nauc_recall_at_3_diff1
value: -3.2990569127745886
- type: nauc_recall_at_3_max
value: 9.856023758120296
- type: nauc_recall_at_3_std
value: 6.496278200315444
- type: nauc_recall_at_5_diff1
value: -5.315582777202729
- type: nauc_recall_at_5_max
value: 10.329077511718229
- type: nauc_recall_at_5_std
value: 11.384098087196932
- type: ndcg_at_1
value: 87.643
- type: ndcg_at_10
value: 76.67399999999999
- type: ndcg_at_100
value: 79.462
- type: ndcg_at_1000
value: 80.43599999999999
- type: ndcg_at_20
value: 77.83
- type: ndcg_at_3
value: 72.256
- type: ndcg_at_5
value: 74.789
- type: precision_at_1
value: 87.643
- type: precision_at_10
value: 15.726999999999999
- type: precision_at_100
value: 1.791
- type: precision_at_1000
value: 0.192
- type: precision_at_20
value: 8.236
- type: precision_at_3
value: 45.919
- type: precision_at_5
value: 29.558
- type: recall_at_1
value: 43.822
- type: recall_at_10
value: 78.636
- type: recall_at_100
value: 89.527
- type: recall_at_1000
value: 95.868
- type: recall_at_20
value: 82.363
- type: recall_at_3
value: 68.879
- type: recall_at_5
value: 73.896
- task:
type: Classification
dataset:
name: MTEB ImdbClassification
type: mteb/imdb
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 96.6608
- type: ap
value: 95.14657820401189
- type: ap_weighted
value: 95.14657820401189
- type: f1
value: 96.66029695623422
- type: f1_weighted
value: 96.66029695623423
- type: main_score
value: 96.6608
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO
type: mteb/msmarco
config: default
split: dev
revision: c5a29a104738b98a9e76336939199e264163d4a0
metrics:
- type: main_score
value: 45.217
- type: map_at_1
value: 24.728
- type: map_at_10
value: 37.933
- type: map_at_100
value: 39.074999999999996
- type: map_at_1000
value: 39.115
- type: map_at_20
value: 38.663
- type: map_at_3
value: 33.904
- type: map_at_5
value: 36.217
- type: mrr_at_1
value: 25.44412607449857
- type: mrr_at_10
value: 38.52640196479737
- type: mrr_at_100
value: 39.60462889736067
- type: mrr_at_1000
value: 39.638904296248526
- type: mrr_at_20
value: 39.2234365827559
- type: mrr_at_3
value: 34.59646609360076
- type: mrr_at_5
value: 36.8801337153773
- type: nauc_map_at_1000_diff1
value: 37.645652178132174
- type: nauc_map_at_1000_max
value: 9.953357023361367
- type: nauc_map_at_1000_std
value: -20.800238036721503
- type: nauc_map_at_100_diff1
value: 37.643073495974555
- type: nauc_map_at_100_max
value: 9.95921239641703
- type: nauc_map_at_100_std
value: -20.76517765535793
- type: nauc_map_at_10_diff1
value: 37.44380763335014
- type: nauc_map_at_10_max
value: 9.917273043055342
- type: nauc_map_at_10_std
value: -21.467951225710898
- type: nauc_map_at_1_diff1
value: 41.02118887981969
- type: nauc_map_at_1_max
value: 8.301113449711778
- type: nauc_map_at_1_std
value: -19.436814224415027
- type: nauc_map_at_20_diff1
value: 37.58156586490493
- type: nauc_map_at_20_max
value: 9.972927967610659
- type: nauc_map_at_20_std
value: -20.951374218839387
- type: nauc_map_at_3_diff1
value: 37.67246795684178
- type: nauc_map_at_3_max
value: 9.307031378909478
- type: nauc_map_at_3_std
value: -21.77026217965021
- type: nauc_map_at_5_diff1
value: 37.39086482095963
- type: nauc_map_at_5_max
value: 9.732739107368566
- type: nauc_map_at_5_std
value: -21.8424296893692
- type: nauc_mrr_at_1000_diff1
value: 37.36666719603192
- type: nauc_mrr_at_1000_max
value: 9.79040465289953
- type: nauc_mrr_at_1000_std
value: -20.590147245965568
- type: nauc_mrr_at_100_diff1
value: 37.36560296629318
- type: nauc_mrr_at_100_max
value: 9.798113710672162
- type: nauc_mrr_at_100_std
value: -20.556791838504292
- type: nauc_mrr_at_10_diff1
value: 37.19257605840734
- type: nauc_mrr_at_10_max
value: 9.749429811638063
- type: nauc_mrr_at_10_std
value: -21.206407664327276
- type: nauc_mrr_at_1_diff1
value: 40.98478651095172
- type: nauc_mrr_at_1_max
value: 8.173841799119707
- type: nauc_mrr_at_1_std
value: -19.530027987868017
- type: nauc_mrr_at_20_diff1
value: 37.29973172861245
- type: nauc_mrr_at_20_max
value: 9.815127660001345
- type: nauc_mrr_at_20_std
value: -20.700860112175928
- type: nauc_mrr_at_3_diff1
value: 37.282848009425734
- type: nauc_mrr_at_3_max
value: 9.172741713108193
- type: nauc_mrr_at_3_std
value: -21.563630513502996
- type: nauc_mrr_at_5_diff1
value: 37.08609827303586
- type: nauc_mrr_at_5_max
value: 9.604643424273284
- type: nauc_mrr_at_5_std
value: -21.580110806494094
- type: nauc_ndcg_at_1000_diff1
value: 37.086587020218545
- type: nauc_ndcg_at_1000_max
value: 10.696860688467472
- type: nauc_ndcg_at_1000_std
value: -19.50989939916873
- type: nauc_ndcg_at_100_diff1
value: 37.03794531268128
- type: nauc_ndcg_at_100_max
value: 10.940820719182339
- type: nauc_ndcg_at_100_std
value: -18.28651832370893
- type: nauc_ndcg_at_10_diff1
value: 36.21062857920633
- type: nauc_ndcg_at_10_max
value: 10.845172882571733
- type: nauc_ndcg_at_10_std
value: -21.454301679510106
- type: nauc_ndcg_at_1_diff1
value: 40.98478651095172
- type: nauc_ndcg_at_1_max
value: 8.173841799119707
- type: nauc_ndcg_at_1_std
value: -19.530027987868017
- type: nauc_ndcg_at_20_diff1
value: 36.583262733100526
- type: nauc_ndcg_at_20_max
value: 11.10492720898974
- type: nauc_ndcg_at_20_std
value: -19.41753284137609
- type: nauc_ndcg_at_3_diff1
value: 36.57271365035382
- type: nauc_ndcg_at_3_max
value: 9.56073433062999
- type: nauc_ndcg_at_3_std
value: -22.324263670932915
- type: nauc_ndcg_at_5_diff1
value: 36.09419372820154
- type: nauc_ndcg_at_5_max
value: 10.357384992631271
- type: nauc_ndcg_at_5_std
value: -22.389578276324894
- type: nauc_precision_at_1000_diff1
value: -2.7435338714030597
- type: nauc_precision_at_1000_max
value: 4.302274933383809
- type: nauc_precision_at_1000_std
value: 8.456846348638948
- type: nauc_precision_at_100_diff1
value: 15.149466332615983
- type: nauc_precision_at_100_max
value: 12.501013731673163
- type: nauc_precision_at_100_std
value: 15.909667509021785
- type: nauc_precision_at_10_diff1
value: 28.699788688314214
- type: nauc_precision_at_10_max
value: 13.024586051842347
- type: nauc_precision_at_10_std
value: -19.197658937078703
- type: nauc_precision_at_1_diff1
value: 40.98478651095172
- type: nauc_precision_at_1_max
value: 8.173841799119707
- type: nauc_precision_at_1_std
value: -19.530027987868017
- type: nauc_precision_at_20_diff1
value: 26.519292942353395
- type: nauc_precision_at_20_max
value: 14.389979272056438
- type: nauc_precision_at_20_std
value: -7.030956994938155
- type: nauc_precision_at_3_diff1
value: 32.87913492278213
- type: nauc_precision_at_3_max
value: 9.673660161387776
- type: nauc_precision_at_3_std
value: -23.905612656592172
- type: nauc_precision_at_5_diff1
value: 30.903850113238597
- type: nauc_precision_at_5_max
value: 11.482375434154898
- type: nauc_precision_at_5_std
value: -23.828657095254247
- type: nauc_recall_at_1000_diff1
value: 35.80765639589219
- type: nauc_recall_at_1000_max
value: 50.94532805969448
- type: nauc_recall_at_1000_std
value: 66.79910877083275
- type: nauc_recall_at_100_diff1
value: 34.96182828311028
- type: nauc_recall_at_100_max
value: 21.729699631790556
- type: nauc_recall_at_100_std
value: 23.509439011686474
- type: nauc_recall_at_10_diff1
value: 31.88371369567137
- type: nauc_recall_at_10_max
value: 14.425389702697073
- type: nauc_recall_at_10_std
value: -20.95578001880924
- type: nauc_recall_at_1_diff1
value: 41.02118887981969
- type: nauc_recall_at_1_max
value: 8.301113449711778
- type: nauc_recall_at_1_std
value: -19.436814224415027
- type: nauc_recall_at_20_diff1
value: 32.42718780622455
- type: nauc_recall_at_20_max
value: 16.90686126329399
- type: nauc_recall_at_20_std
value: -9.38158227016737
- type: nauc_recall_at_3_diff1
value: 33.68966646043966
- type: nauc_recall_at_3_max
value: 10.336277419708532
- type: nauc_recall_at_3_std
value: -23.80165869168538
- type: nauc_recall_at_5_diff1
value: 32.26258807452426
- type: nauc_recall_at_5_max
value: 12.303713005399935
- type: nauc_recall_at_5_std
value: -23.87721891164968
- type: ndcg_at_1
value: 25.444
- type: ndcg_at_10
value: 45.217
- type: ndcg_at_100
value: 50.575
- type: ndcg_at_1000
value: 51.519999999999996
- type: ndcg_at_20
value: 47.786
- type: ndcg_at_3
value: 37.067
- type: ndcg_at_5
value: 41.184
- type: precision_at_1
value: 25.444
- type: precision_at_10
value: 7.07
- type: precision_at_100
value: 0.9730000000000001
- type: precision_at_1000
value: 0.106
- type: precision_at_20
value: 4.072
- type: precision_at_3
value: 15.754999999999999
- type: precision_at_5
value: 11.544
- type: recall_at_1
value: 24.728
- type: recall_at_10
value: 67.607
- type: recall_at_100
value: 92.094
- type: recall_at_1000
value: 99.165
- type: recall_at_20
value: 77.529
- type: recall_at_3
value: 45.535
- type: recall_at_5
value: 55.394
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (en)
type: mteb/mtop_domain
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 99.01276789785682
- type: f1
value: 98.9288649250924
- type: f1_weighted
value: 99.01406884928141
- type: main_score
value: 99.01276789785682
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (en)
type: mteb/mtop_intent
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 92.78385772913816
- type: f1
value: 79.78115704297824
- type: f1_weighted
value: 93.90424147486428
- type: main_score
value: 92.78385772913816
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (en)
type: mteb/amazon_massive_intent
config: en
split: test
revision: 4672e20407010da34463acc759c162ca9734bca6
metrics:
- type: accuracy
value: 85.83053127101546
- type: f1
value: 82.72036139888232
- type: f1_weighted
value: 85.81759723866098
- type: main_score
value: 85.83053127101546
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (en)
type: mteb/amazon_massive_scenario
config: en
split: test
revision: fad2c6e8459f9e1c45d9315f4953d921437d70f8
metrics:
- type: accuracy
value: 90.19838601210489
- type: f1
value: 89.55260197964978
- type: f1_weighted
value: 90.11422965504119
- type: main_score
value: 90.19838601210489
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringP2P
type: mteb/medrxiv-clustering-p2p
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: main_score
value: 46.866746897607094
- type: v_measure
value: 46.866746897607094
- type: v_measure_std
value: 1.0966477896919726
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringS2S
type: mteb/medrxiv-clustering-s2s
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: main_score
value: 44.6538827415503
- type: v_measure
value: 44.6538827415503
- type: v_measure_std
value: 1.1649569936599116
- task:
type: Reranking
dataset:
name: MTEB MindSmallReranking
type: mteb/mind_small
config: default
split: test
revision: 59042f120c80e8afa9cdbb224f67076cec0fc9a7
metrics:
- type: main_score
value: 33.05449204940555
- type: map
value: 33.05449204940555
- type: mrr
value: 34.32562058439585
- type: nAUC_map_diff1
value: 11.465656013162807
- type: nAUC_map_max
value: -20.400088169502308
- type: nAUC_map_std
value: -2.638964886362445
- type: nAUC_mrr_diff1
value: 10.644290702481207
- type: nAUC_mrr_max
value: -15.304687384645769
- type: nAUC_mrr_std
value: -0.519919931348978
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus
type: mteb/nfcorpus
config: default
split: test
revision: ec0fa4fe99da2ff19ca1214b7966684033a58814
metrics:
- type: main_score
value: 41.998000000000005
- type: map_at_1
value: 6.907000000000001
- type: map_at_10
value: 16.397000000000002
- type: map_at_100
value: 21.69
- type: map_at_1000
value: 23.652
- type: map_at_20
value: 18.629
- type: map_at_3
value: 11.969000000000001
- type: map_at_5
value: 13.894
- type: mrr_at_1
value: 53.25077399380805
- type: mrr_at_10
value: 61.8561108653988
- type: mrr_at_100
value: 62.42447851935404
- type: mrr_at_1000
value: 62.459626424428095
- type: mrr_at_20
value: 62.287236389990696
- type: mrr_at_3
value: 60.42311661506711
- type: mrr_at_5
value: 61.36738906088753
- type: nauc_map_at_1000_diff1
value: 17.159461939643844
- type: nauc_map_at_1000_max
value: 32.42764938789903
- type: nauc_map_at_1000_std
value: 11.039427848422093
- type: nauc_map_at_100_diff1
value: 19.089532984187503
- type: nauc_map_at_100_max
value: 31.96721085058713
- type: nauc_map_at_100_std
value: 6.947468655726444
- type: nauc_map_at_10_diff1
value: 25.77255342629802
- type: nauc_map_at_10_max
value: 26.163590320961543
- type: nauc_map_at_10_std
value: -5.2588093720998375
- type: nauc_map_at_1_diff1
value: 46.31602607957798
- type: nauc_map_at_1_max
value: 11.807757660801942
- type: nauc_map_at_1_std
value: -13.984889089354317
- type: nauc_map_at_20_diff1
value: 22.308161130465365
- type: nauc_map_at_20_max
value: 29.070587307827722
- type: nauc_map_at_20_std
value: -1.0103056620851558
- type: nauc_map_at_3_diff1
value: 33.580827849617506
- type: nauc_map_at_3_max
value: 17.661630885799042
- type: nauc_map_at_3_std
value: -11.463282544041888
- type: nauc_map_at_5_diff1
value: 30.32603342696912
- type: nauc_map_at_5_max
value: 20.938905485667245
- type: nauc_map_at_5_std
value: -10.537086968155755
- type: nauc_mrr_at_1000_diff1
value: 24.45065397805829
- type: nauc_mrr_at_1000_max
value: 48.17519860927417
- type: nauc_mrr_at_1000_std
value: 30.350767549118903
- type: nauc_mrr_at_100_diff1
value: 24.444061606534486
- type: nauc_mrr_at_100_max
value: 48.1922894212229
- type: nauc_mrr_at_100_std
value: 30.379257816584094
- type: nauc_mrr_at_10_diff1
value: 24.25598717198779
- type: nauc_mrr_at_10_max
value: 48.10437607774264
- type: nauc_mrr_at_10_std
value: 30.090202482685996
- type: nauc_mrr_at_1_diff1
value: 26.907595285201264
- type: nauc_mrr_at_1_max
value: 44.006974050369955
- type: nauc_mrr_at_1_std
value: 26.921001962861062
- type: nauc_mrr_at_20_diff1
value: 24.462771570553738
- type: nauc_mrr_at_20_max
value: 48.264688196799746
- type: nauc_mrr_at_20_std
value: 30.498095141265914
- type: nauc_mrr_at_3_diff1
value: 24.76829388237229
- type: nauc_mrr_at_3_max
value: 48.213758704739924
- type: nauc_mrr_at_3_std
value: 30.1502853918892
- type: nauc_mrr_at_5_diff1
value: 24.476494932330247
- type: nauc_mrr_at_5_max
value: 47.977250552198804
- type: nauc_mrr_at_5_std
value: 29.65248143104835
- type: nauc_ndcg_at_1000_diff1
value: 13.055818920426246
- type: nauc_ndcg_at_1000_max
value: 46.00986444256306
- type: nauc_ndcg_at_1000_std
value: 29.622662054922085
- type: nauc_ndcg_at_100_diff1
value: 12.260551238228816
- type: nauc_ndcg_at_100_max
value: 39.89783048267698
- type: nauc_ndcg_at_100_std
value: 23.806961617956613
- type: nauc_ndcg_at_10_diff1
value: 11.002915931619567
- type: nauc_ndcg_at_10_max
value: 39.79323759244374
- type: nauc_ndcg_at_10_std
value: 23.053072152911046
- type: nauc_ndcg_at_1_diff1
value: 27.560910719974434
- type: nauc_ndcg_at_1_max
value: 41.21084046258119
- type: nauc_ndcg_at_1_std
value: 26.112891742912893
- type: nauc_ndcg_at_20_diff1
value: 10.085854089024496
- type: nauc_ndcg_at_20_max
value: 37.88629173784684
- type: nauc_ndcg_at_20_std
value: 23.17664322248358
- type: nauc_ndcg_at_3_diff1
value: 16.58969583405987
- type: nauc_ndcg_at_3_max
value: 41.282222954101435
- type: nauc_ndcg_at_3_std
value: 21.080670648392747
- type: nauc_ndcg_at_5_diff1
value: 13.893127947909885
- type: nauc_ndcg_at_5_max
value: 40.21188015992804
- type: nauc_ndcg_at_5_std
value: 21.417443978842652
- type: nauc_precision_at_1000_diff1
value: -17.227504530334564
- type: nauc_precision_at_1000_max
value: 3.798554468439066
- type: nauc_precision_at_1000_std
value: 35.73617809452683
- type: nauc_precision_at_100_diff1
value: -17.63388230218776
- type: nauc_precision_at_100_max
value: 15.079399882407094
- type: nauc_precision_at_100_std
value: 41.83698491321226
- type: nauc_precision_at_10_diff1
value: -11.850925959645156
- type: nauc_precision_at_10_max
value: 35.93283968364352
- type: nauc_precision_at_10_std
value: 34.391271855921296
- type: nauc_precision_at_1_diff1
value: 27.730860778824823
- type: nauc_precision_at_1_max
value: 43.97462471516834
- type: nauc_precision_at_1_std
value: 27.491068270978896
- type: nauc_precision_at_20_diff1
value: -14.281328840943347
- type: nauc_precision_at_20_max
value: 29.469099781759006
- type: nauc_precision_at_20_std
value: 38.54703022340941
- type: nauc_precision_at_3_diff1
value: 3.486986910413196
- type: nauc_precision_at_3_max
value: 41.21107780473768
- type: nauc_precision_at_3_std
value: 24.057479124531216
- type: nauc_precision_at_5_diff1
value: -3.0623787872866233
- type: nauc_precision_at_5_max
value: 37.49266386466702
- type: nauc_precision_at_5_std
value: 26.894454268004935
- type: nauc_recall_at_1000_diff1
value: -2.446891864334283
- type: nauc_recall_at_1000_max
value: 23.867293584643377
- type: nauc_recall_at_1000_std
value: 16.34707128224595
- type: nauc_recall_at_100_diff1
value: 4.891133690841179
- type: nauc_recall_at_100_max
value: 24.56727964996522
- type: nauc_recall_at_100_std
value: 9.847212953200797
- type: nauc_recall_at_10_diff1
value: 19.211912363585288
- type: nauc_recall_at_10_max
value: 24.825344777920737
- type: nauc_recall_at_10_std
value: -5.447989195041898
- type: nauc_recall_at_1_diff1
value: 46.31602607957798
- type: nauc_recall_at_1_max
value: 11.807757660801942
- type: nauc_recall_at_1_std
value: -13.984889089354317
- type: nauc_recall_at_20_diff1
value: 12.233372054304805
- type: nauc_recall_at_20_max
value: 22.284108685207148
- type: nauc_recall_at_20_std
value: -4.317138366746209
- type: nauc_recall_at_3_diff1
value: 28.394631527225815
- type: nauc_recall_at_3_max
value: 15.593864852625462
- type: nauc_recall_at_3_std
value: -12.383531804314593
- type: nauc_recall_at_5_diff1
value: 24.457441304950343
- type: nauc_recall_at_5_max
value: 19.080049396281623
- type: nauc_recall_at_5_std
value: -11.879747703626627
- type: ndcg_at_1
value: 51.548
- type: ndcg_at_10
value: 41.998000000000005
- type: ndcg_at_100
value: 39.626
- type: ndcg_at_1000
value: 48.707
- type: ndcg_at_20
value: 40.181
- type: ndcg_at_3
value: 48.06
- type: ndcg_at_5
value: 45.829
- type: precision_at_1
value: 52.941
- type: precision_at_10
value: 31.330999999999996
- type: precision_at_100
value: 10.421
- type: precision_at_1000
value: 2.428
- type: precision_at_20
value: 24.118000000000002
- type: precision_at_3
value: 45.408
- type: precision_at_5
value: 39.938
- type: recall_at_1
value: 6.907000000000001
- type: recall_at_10
value: 20.51
- type: recall_at_100
value: 40.857
- type: recall_at_1000
value: 73.616
- type: recall_at_20
value: 26.52
- type: recall_at_3
value: 13.267999999999999
- type: recall_at_5
value: 16.141
- task:
type: Retrieval
dataset:
name: MTEB NQ
type: mteb/nq
config: default
split: test
revision: b774495ed302d8c44a3a7ea25c90dbce03968f31
metrics:
- type: main_score
value: 71.8
- type: map_at_1
value: 47.629
- type: map_at_10
value: 64.846
- type: map_at_100
value: 65.40899999999999
- type: map_at_1000
value: 65.416
- type: map_at_20
value: 65.239
- type: map_at_3
value: 61.185
- type: map_at_5
value: 63.583
- type: mrr_at_1
value: 53.15758980301275
- type: mrr_at_10
value: 67.12880961577366
- type: mrr_at_100
value: 67.44006405426018
- type: mrr_at_1000
value: 67.44519150402294
- type: mrr_at_20
value: 67.34317135515428
- type: mrr_at_3
value: 64.5905755117805
- type: mrr_at_5
value: 66.24613750482806
- type: nauc_map_at_1000_diff1
value: 45.73812106517133
- type: nauc_map_at_1000_max
value: 35.21262031755756
- type: nauc_map_at_1000_std
value: -5.549443574026027
- type: nauc_map_at_100_diff1
value: 45.74254652176879
- type: nauc_map_at_100_max
value: 35.22349167515518
- type: nauc_map_at_100_std
value: -5.53697496044773
- type: nauc_map_at_10_diff1
value: 45.62837128377087
- type: nauc_map_at_10_max
value: 35.3261562342222
- type: nauc_map_at_10_std
value: -5.761924414031163
- type: nauc_map_at_1_diff1
value: 48.69187848570499
- type: nauc_map_at_1_max
value: 28.687996096473476
- type: nauc_map_at_1_std
value: -7.518605958272523
- type: nauc_map_at_20_diff1
value: 45.702303442220035
- type: nauc_map_at_20_max
value: 35.30719944705456
- type: nauc_map_at_20_std
value: -5.59505654742681
- type: nauc_map_at_3_diff1
value: 45.376813726832474
- type: nauc_map_at_3_max
value: 34.68452149643597
- type: nauc_map_at_3_std
value: -7.329014950379634
- type: nauc_map_at_5_diff1
value: 45.29528861989316
- type: nauc_map_at_5_max
value: 35.35741440869229
- type: nauc_map_at_5_std
value: -6.028788612259288
- type: nauc_mrr_at_1000_diff1
value: 46.11808147912517
- type: nauc_mrr_at_1000_max
value: 35.59241850411947
- type: nauc_mrr_at_1000_std
value: -3.4072428526109317
- type: nauc_mrr_at_100_diff1
value: 46.121345545514046
- type: nauc_mrr_at_100_max
value: 35.60147795073431
- type: nauc_mrr_at_100_std
value: -3.3965322447588826
- type: nauc_mrr_at_10_diff1
value: 46.0920068210502
- type: nauc_mrr_at_10_max
value: 35.79649987854354
- type: nauc_mrr_at_10_std
value: -3.339624589368137
- type: nauc_mrr_at_1_diff1
value: 49.101364605656194
- type: nauc_mrr_at_1_max
value: 31.500796071482146
- type: nauc_mrr_at_1_std
value: -4.183818500718156
- type: nauc_mrr_at_20_diff1
value: 46.088076630465594
- type: nauc_mrr_at_20_max
value: 35.682131663053205
- type: nauc_mrr_at_20_std
value: -3.35939023178519
- type: nauc_mrr_at_3_diff1
value: 45.47570812708642
- type: nauc_mrr_at_3_max
value: 35.741892517632984
- type: nauc_mrr_at_3_std
value: -4.135335963822013
- type: nauc_mrr_at_5_diff1
value: 45.78903474184014
- type: nauc_mrr_at_5_max
value: 35.91273593700205
- type: nauc_mrr_at_5_std
value: -3.467873421286869
- type: nauc_ndcg_at_1000_diff1
value: 45.5056583000012
- type: nauc_ndcg_at_1000_max
value: 36.34328379251593
- type: nauc_ndcg_at_1000_std
value: -4.0759698229323345
- type: nauc_ndcg_at_100_diff1
value: 45.61918946477166
- type: nauc_ndcg_at_100_max
value: 36.675460335836235
- type: nauc_ndcg_at_100_std
value: -3.6795334726235986
- type: nauc_ndcg_at_10_diff1
value: 45.15343994274541
- type: nauc_ndcg_at_10_max
value: 37.48139242964657
- type: nauc_ndcg_at_10_std
value: -4.287039084554882
- type: nauc_ndcg_at_1_diff1
value: 49.101364605656194
- type: nauc_ndcg_at_1_max
value: 31.500796071482146
- type: nauc_ndcg_at_1_std
value: -4.183818500718156
- type: nauc_ndcg_at_20_diff1
value: 45.310026313402375
- type: nauc_ndcg_at_20_max
value: 37.32177497902133
- type: nauc_ndcg_at_20_std
value: -3.8214360391282587
- type: nauc_ndcg_at_3_diff1
value: 44.27064370528994
- type: nauc_ndcg_at_3_max
value: 36.380294033571396
- type: nauc_ndcg_at_3_std
value: -6.844263370898355
- type: nauc_ndcg_at_5_diff1
value: 44.29933499225583
- type: nauc_ndcg_at_5_max
value: 37.46477041822136
- type: nauc_ndcg_at_5_std
value: -4.866548530467956
- type: nauc_precision_at_1000_diff1
value: -14.666553359142306
- type: nauc_precision_at_1000_max
value: -0.5599759853201481
- type: nauc_precision_at_1000_std
value: 16.8370925526591
- type: nauc_precision_at_100_diff1
value: -11.816251306246278
- type: nauc_precision_at_100_max
value: 2.969819268208207
- type: nauc_precision_at_100_std
value: 18.59422946634747
- type: nauc_precision_at_10_diff1
value: 1.2050200086029401
- type: nauc_precision_at_10_max
value: 17.59930352911209
- type: nauc_precision_at_10_std
value: 13.714495717588985
- type: nauc_precision_at_1_diff1
value: 49.101364605656194
- type: nauc_precision_at_1_max
value: 31.500796071482146
- type: nauc_precision_at_1_std
value: -4.183818500718156
- type: nauc_precision_at_20_diff1
value: -5.263476664822757
- type: nauc_precision_at_20_max
value: 11.42004823600046
- type: nauc_precision_at_20_std
value: 16.510514518664994
- type: nauc_precision_at_3_diff1
value: 20.116460379305828
- type: nauc_precision_at_3_max
value: 31.32235038301311
- type: nauc_precision_at_3_std
value: 2.7486717133871923
- type: nauc_precision_at_5_diff1
value: 9.57451645335723
- type: nauc_precision_at_5_max
value: 25.28449126580587
- type: nauc_precision_at_5_std
value: 9.955736162466767
- type: nauc_recall_at_1000_diff1
value: -21.632253065978794
- type: nauc_recall_at_1000_max
value: 70.14409090958776
- type: nauc_recall_at_1000_std
value: 65.61658090892989
- type: nauc_recall_at_100_diff1
value: 51.83161124806711
- type: nauc_recall_at_100_max
value: 77.49921361841523
- type: nauc_recall_at_100_std
value: 48.352508746719444
- type: nauc_recall_at_10_diff1
value: 39.86695231362791
- type: nauc_recall_at_10_max
value: 50.12029094799474
- type: nauc_recall_at_10_std
value: 0.1650940628131058
- type: nauc_recall_at_1_diff1
value: 48.69187848570499
- type: nauc_recall_at_1_max
value: 28.687996096473476
- type: nauc_recall_at_1_std
value: -7.518605958272523
- type: nauc_recall_at_20_diff1
value: 39.14155398061627
- type: nauc_recall_at_20_max
value: 56.78559423716229
- type: nauc_recall_at_20_std
value: 7.9728224572344075
- type: nauc_recall_at_3_diff1
value: 38.69589523432158
- type: nauc_recall_at_3_max
value: 39.53271258375579
- type: nauc_recall_at_3_std
value: -8.646925065787512
- type: nauc_recall_at_5_diff1
value: 37.45922652959002
- type: nauc_recall_at_5_max
value: 44.4911958995867
- type: nauc_recall_at_5_std
value: -3.5659842556375594
- type: ndcg_at_1
value: 53.15800000000001
- type: ndcg_at_10
value: 71.8
- type: ndcg_at_100
value: 73.85199999999999
- type: ndcg_at_1000
value: 74.017
- type: ndcg_at_20
value: 72.933
- type: ndcg_at_3
value: 65.479
- type: ndcg_at_5
value: 69.182
- type: precision_at_1
value: 53.15800000000001
- type: precision_at_10
value: 10.805
- type: precision_at_100
value: 1.2
- type: precision_at_1000
value: 0.122
- type: precision_at_20
value: 5.694
- type: precision_at_3
value: 28.939999999999998
- type: precision_at_5
value: 19.641000000000002
- type: recall_at_1
value: 47.629
- type: recall_at_10
value: 90.204
- type: recall_at_100
value: 98.66
- type: recall_at_1000
value: 99.874
- type: recall_at_20
value: 94.24
- type: recall_at_3
value: 74.394
- type: recall_at_5
value: 82.711
- task:
type: Retrieval
dataset:
name: MTEB QuoraRetrieval
type: mteb/quora
config: default
split: test
revision: e4e08e0b7dbe3c8700f0daef558ff32256715259
metrics:
- type: main_score
value: 90.025
- type: map_at_1
value: 72.222
- type: map_at_10
value: 86.58500000000001
- type: map_at_100
value: 87.176
- type: map_at_1000
value: 87.188
- type: map_at_20
value: 86.97399999999999
- type: map_at_3
value: 83.736
- type: map_at_5
value: 85.554
- type: mrr_at_1
value: 83.04
- type: mrr_at_10
value: 89.05599603174585
- type: mrr_at_100
value: 89.12398891419457
- type: mrr_at_1000
value: 89.12434072241001
- type: mrr_at_20
value: 89.10416280692111
- type: mrr_at_3
value: 88.23833333333312
- type: mrr_at_5
value: 88.82233333333308
- type: nauc_map_at_1000_diff1
value: 78.29348113313218
- type: nauc_map_at_1000_max
value: 32.31386754277228
- type: nauc_map_at_1000_std
value: -50.47543661484052
- type: nauc_map_at_100_diff1
value: 78.29618548618575
- type: nauc_map_at_100_max
value: 32.301475680947846
- type: nauc_map_at_100_std
value: -50.50303428814228
- type: nauc_map_at_10_diff1
value: 78.47383776440803
- type: nauc_map_at_10_max
value: 31.839339990133563
- type: nauc_map_at_10_std
value: -52.832713555976
- type: nauc_map_at_1_diff1
value: 82.46330147467418
- type: nauc_map_at_1_max
value: 23.497664918373538
- type: nauc_map_at_1_std
value: -43.824657665520704
- type: nauc_map_at_20_diff1
value: 78.34772176474422
- type: nauc_map_at_20_max
value: 32.16495182893947
- type: nauc_map_at_20_std
value: -51.503292726558605
- type: nauc_map_at_3_diff1
value: 79.07823813069432
- type: nauc_map_at_3_max
value: 29.395911687513976
- type: nauc_map_at_3_std
value: -54.16377546873304
- type: nauc_map_at_5_diff1
value: 78.73076619520454
- type: nauc_map_at_5_max
value: 30.700453118585237
- type: nauc_map_at_5_std
value: -54.130514177664054
- type: nauc_mrr_at_1000_diff1
value: 79.04736184471865
- type: nauc_mrr_at_1000_max
value: 34.43004593837643
- type: nauc_mrr_at_1000_std
value: -46.137269068195316
- type: nauc_mrr_at_100_diff1
value: 79.04698704288086
- type: nauc_mrr_at_100_max
value: 34.4305553741175
- type: nauc_mrr_at_100_std
value: -46.13786687786434
- type: nauc_mrr_at_10_diff1
value: 79.04490677485934
- type: nauc_mrr_at_10_max
value: 34.38170181522227
- type: nauc_mrr_at_10_std
value: -46.38129875681807
- type: nauc_mrr_at_1_diff1
value: 79.87159215719124
- type: nauc_mrr_at_1_max
value: 34.05882339253136
- type: nauc_mrr_at_1_std
value: -43.56093395137571
- type: nauc_mrr_at_20_diff1
value: 79.04384174535653
- type: nauc_mrr_at_20_max
value: 34.442136494675005
- type: nauc_mrr_at_20_std
value: -46.205458519638654
- type: nauc_mrr_at_3_diff1
value: 78.78154519155487
- type: nauc_mrr_at_3_max
value: 34.74995000500305
- type: nauc_mrr_at_3_std
value: -46.36264203155416
- type: nauc_mrr_at_5_diff1
value: 79.02631187177
- type: nauc_mrr_at_5_max
value: 34.538698249632205
- type: nauc_mrr_at_5_std
value: -46.468881576157465
- type: nauc_ndcg_at_1000_diff1
value: 78.25260097014645
- type: nauc_ndcg_at_1000_max
value: 33.68584498704271
- type: nauc_ndcg_at_1000_std
value: -48.44716779494868
- type: nauc_ndcg_at_100_diff1
value: 78.25115412256716
- type: nauc_ndcg_at_100_max
value: 33.63652663447088
- type: nauc_ndcg_at_100_std
value: -48.489243909024715
- type: nauc_ndcg_at_10_diff1
value: 78.23875101557334
- type: nauc_ndcg_at_10_max
value: 32.65217430043823
- type: nauc_ndcg_at_10_std
value: -52.57770468845309
- type: nauc_ndcg_at_1_diff1
value: 79.87159215719124
- type: nauc_ndcg_at_1_max
value: 34.05882339253136
- type: nauc_ndcg_at_1_std
value: -43.56093395137571
- type: nauc_ndcg_at_20_diff1
value: 78.23478552311765
- type: nauc_ndcg_at_20_max
value: 33.30691737901109
- type: nauc_ndcg_at_20_std
value: -50.78412614854527
- type: nauc_ndcg_at_3_diff1
value: 77.66134485470224
- type: nauc_ndcg_at_3_max
value: 32.19504710373125
- type: nauc_ndcg_at_3_std
value: -52.01636728550155
- type: nauc_ndcg_at_5_diff1
value: 78.04734137324255
- type: nauc_ndcg_at_5_max
value: 31.94593625591248
- type: nauc_ndcg_at_5_std
value: -53.02169800690546
- type: nauc_precision_at_1000_diff1
value: -45.771948123542636
- type: nauc_precision_at_1000_max
value: -5.182406190477681
- type: nauc_precision_at_1000_std
value: 41.14460438707817
- type: nauc_precision_at_100_diff1
value: -45.64767154261461
- type: nauc_precision_at_100_max
value: -5.046308286851713
- type: nauc_precision_at_100_std
value: 41.07186716587844
- type: nauc_precision_at_10_diff1
value: -42.26779562305825
- type: nauc_precision_at_10_max
value: -1.1264852893323076
- type: nauc_precision_at_10_std
value: 27.62275729822392
- type: nauc_precision_at_1_diff1
value: 79.87159215719124
- type: nauc_precision_at_1_max
value: 34.05882339253136
- type: nauc_precision_at_1_std
value: -43.56093395137571
- type: nauc_precision_at_20_diff1
value: -44.24293221128388
- type: nauc_precision_at_20_max
value: -3.1345628837361867
- type: nauc_precision_at_20_std
value: 34.23625492740366
- type: nauc_precision_at_3_diff1
value: -24.925251389823348
- type: nauc_precision_at_3_max
value: 6.622188833369412
- type: nauc_precision_at_3_std
value: 6.424741786858512
- type: nauc_precision_at_5_diff1
value: -36.1407949990387
- type: nauc_precision_at_5_max
value: 1.7533948968374462
- type: nauc_precision_at_5_std
value: 17.914083278982634
- type: nauc_recall_at_1000_diff1
value: 52.26815466244496
- type: nauc_recall_at_1000_max
value: 69.73611104239443
- type: nauc_recall_at_1000_std
value: 73.18969965863008
- type: nauc_recall_at_100_diff1
value: 70.80557513785271
- type: nauc_recall_at_100_max
value: 33.333440086544556
- type: nauc_recall_at_100_std
value: -38.75992366905504
- type: nauc_recall_at_10_diff1
value: 74.45948457438163
- type: nauc_recall_at_10_max
value: 26.64948512428989
- type: nauc_recall_at_10_std
value: -82.90334292052363
- type: nauc_recall_at_1_diff1
value: 82.46330147467418
- type: nauc_recall_at_1_max
value: 23.497664918373538
- type: nauc_recall_at_1_std
value: -43.824657665520704
- type: nauc_recall_at_20_diff1
value: 73.80140280887753
- type: nauc_recall_at_20_max
value: 30.361616426734965
- type: nauc_recall_at_20_std
value: -81.1418804447414
- type: nauc_recall_at_3_diff1
value: 75.19854736087834
- type: nauc_recall_at_3_max
value: 26.12298005045584
- type: nauc_recall_at_3_std
value: -63.42583714745169
- type: nauc_recall_at_5_diff1
value: 74.16423451950358
- type: nauc_recall_at_5_max
value: 25.552390331018987
- type: nauc_recall_at_5_std
value: -71.15891947773912
- type: ndcg_at_1
value: 83.04
- type: ndcg_at_10
value: 90.025
- type: ndcg_at_100
value: 91.006
- type: ndcg_at_1000
value: 91.061
- type: ndcg_at_20
value: 90.556
- type: ndcg_at_3
value: 87.493
- type: ndcg_at_5
value: 88.955
- type: precision_at_1
value: 83.04
- type: precision_at_10
value: 13.667000000000002
- type: precision_at_100
value: 1.542
- type: precision_at_1000
value: 0.157
- type: precision_at_20
value: 7.221
- type: precision_at_3
value: 38.433
- type: precision_at_5
value: 25.228
- type: recall_at_1
value: 72.222
- type: recall_at_10
value: 96.604
- type: recall_at_100
value: 99.786
- type: recall_at_1000
value: 99.996
- type: recall_at_20
value: 98.253
- type: recall_at_3
value: 89.276
- type: recall_at_5
value: 93.46
- task:
type: Clustering
dataset:
name: MTEB RedditClustering
type: mteb/reddit-clustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: main_score
value: 72.86492101891123
- type: v_measure
value: 72.86492101891123
- type: v_measure_std
value: 2.778711445144635
- task:
type: Clustering
dataset:
name: MTEB RedditClusteringP2P
type: mteb/reddit-clustering-p2p
config: default
split: test
revision: 385e3cb46b4cfa89021f56c4380204149d0efe33
metrics:
- type: main_score
value: 75.27316726548479
- type: v_measure
value: 75.27316726548479
- type: v_measure_std
value: 8.87871936725338
- task:
type: Retrieval
dataset:
name: MTEB SCIDOCS
type: mteb/scidocs
config: default
split: test
revision: f8c2fcf00f625baaa80f62ec5bd9e1fff3b8ae88
metrics:
- type: main_score
value: 26.638
- type: map_at_1
value: 6.128
- type: map_at_10
value: 16.472
- type: map_at_100
value: 19.522000000000002
- type: map_at_1000
value: 19.898
- type: map_at_20
value: 18.098
- type: map_at_3
value: 11.283
- type: map_at_5
value: 13.771
- type: mrr_at_1
value: 30.2
- type: mrr_at_10
value: 42.621150793650735
- type: mrr_at_100
value: 43.740858712021954
- type: mrr_at_1000
value: 43.762699500220904
- type: mrr_at_20
value: 43.383639927753634
- type: mrr_at_3
value: 38.83333333333331
- type: mrr_at_5
value: 41.14833333333326
- type: nauc_map_at_1000_diff1
value: 13.13534664124808
- type: nauc_map_at_1000_max
value: 29.346654566149795
- type: nauc_map_at_1000_std
value: 18.08121186982413
- type: nauc_map_at_100_diff1
value: 13.098072728041538
- type: nauc_map_at_100_max
value: 29.299084480697523
- type: nauc_map_at_100_std
value: 17.961620202918464
- type: nauc_map_at_10_diff1
value: 14.001743720394682
- type: nauc_map_at_10_max
value: 28.04128290996403
- type: nauc_map_at_10_std
value: 13.744481555974716
- type: nauc_map_at_1_diff1
value: 22.1926640424872
- type: nauc_map_at_1_max
value: 21.32609279586034
- type: nauc_map_at_1_std
value: 6.566596302915438
- type: nauc_map_at_20_diff1
value: 13.57313142419664
- type: nauc_map_at_20_max
value: 28.93840146319476
- type: nauc_map_at_20_std
value: 16.50869367365676
- type: nauc_map_at_3_diff1
value: 17.707700541948462
- type: nauc_map_at_3_max
value: 26.058174051376238
- type: nauc_map_at_3_std
value: 9.943924560735267
- type: nauc_map_at_5_diff1
value: 17.11844492157723
- type: nauc_map_at_5_max
value: 27.865247403049388
- type: nauc_map_at_5_std
value: 11.372588172121546
- type: nauc_mrr_at_1000_diff1
value: 21.11248719936198
- type: nauc_mrr_at_1000_max
value: 26.734172102201466
- type: nauc_mrr_at_1000_std
value: 11.766121765437228
- type: nauc_mrr_at_100_diff1
value: 21.107109982277702
- type: nauc_mrr_at_100_max
value: 26.741616065723267
- type: nauc_mrr_at_100_std
value: 11.789802686224208
- type: nauc_mrr_at_10_diff1
value: 20.74108639793207
- type: nauc_mrr_at_10_max
value: 26.920838463358333
- type: nauc_mrr_at_10_std
value: 11.849217361926522
- type: nauc_mrr_at_1_diff1
value: 22.177437860573356
- type: nauc_mrr_at_1_max
value: 21.88074521417754
- type: nauc_mrr_at_1_std
value: 6.776011900101789
- type: nauc_mrr_at_20_diff1
value: 21.126633710175994
- type: nauc_mrr_at_20_max
value: 26.860736480370974
- type: nauc_mrr_at_20_std
value: 11.815411633726338
- type: nauc_mrr_at_3_diff1
value: 21.689245200066466
- type: nauc_mrr_at_3_max
value: 26.187305092831625
- type: nauc_mrr_at_3_std
value: 10.895380313134332
- type: nauc_mrr_at_5_diff1
value: 20.898811082479778
- type: nauc_mrr_at_5_max
value: 26.939217247104036
- type: nauc_mrr_at_5_std
value: 11.77832949822472
- type: nauc_ndcg_at_1000_diff1
value: 13.251184947898546
- type: nauc_ndcg_at_1000_max
value: 30.879594164526146
- type: nauc_ndcg_at_1000_std
value: 23.125206047366625
- type: nauc_ndcg_at_100_diff1
value: 12.549100649053676
- type: nauc_ndcg_at_100_max
value: 30.634680845419123
- type: nauc_ndcg_at_100_std
value: 23.296226055422984
- type: nauc_ndcg_at_10_diff1
value: 14.475144549294322
- type: nauc_ndcg_at_10_max
value: 29.450349815417336
- type: nauc_ndcg_at_10_std
value: 15.94068314781612
- type: nauc_ndcg_at_1_diff1
value: 22.177437860573356
- type: nauc_ndcg_at_1_max
value: 21.88074521417754
- type: nauc_ndcg_at_1_std
value: 6.776011900101789
- type: nauc_ndcg_at_20_diff1
value: 14.173669585802266
- type: nauc_ndcg_at_20_max
value: 30.475890854725
- type: nauc_ndcg_at_20_std
value: 19.863898148221704
- type: nauc_ndcg_at_3_diff1
value: 18.93971261196868
- type: nauc_ndcg_at_3_max
value: 27.3707298720736
- type: nauc_ndcg_at_3_std
value: 11.439810510051224
- type: nauc_ndcg_at_5_diff1
value: 17.89535958094687
- type: nauc_ndcg_at_5_max
value: 29.272740466638425
- type: nauc_ndcg_at_5_std
value: 13.402467626635909
- type: nauc_precision_at_1000_diff1
value: -3.811547048784123
- type: nauc_precision_at_1000_max
value: 22.55165337197117
- type: nauc_precision_at_1000_std
value: 35.98524999650108
- type: nauc_precision_at_100_diff1
value: 0.6474234774922896
- type: nauc_precision_at_100_max
value: 25.06920726527032
- type: nauc_precision_at_100_std
value: 32.31439698982313
- type: nauc_precision_at_10_diff1
value: 7.943127218139508
- type: nauc_precision_at_10_max
value: 28.571937636787197
- type: nauc_precision_at_10_std
value: 18.8472620918488
- type: nauc_precision_at_1_diff1
value: 22.177437860573356
- type: nauc_precision_at_1_max
value: 21.88074521417754
- type: nauc_precision_at_1_std
value: 6.776011900101789
- type: nauc_precision_at_20_diff1
value: 6.981574259607366
- type: nauc_precision_at_20_max
value: 28.986094397038727
- type: nauc_precision_at_20_std
value: 25.83129974001146
- type: nauc_precision_at_3_diff1
value: 17.197490724039355
- type: nauc_precision_at_3_max
value: 29.17569320583099
- type: nauc_precision_at_3_std
value: 13.430554945991846
- type: nauc_precision_at_5_diff1
value: 14.952364330739362
- type: nauc_precision_at_5_max
value: 31.053243354846977
- type: nauc_precision_at_5_std
value: 15.856312752807822
- type: nauc_recall_at_1000_diff1
value: -4.8224253128926975
- type: nauc_recall_at_1000_max
value: 21.3989024429911
- type: nauc_recall_at_1000_std
value: 39.152234275603604
- type: nauc_recall_at_100_diff1
value: 0.11936808422867201
- type: nauc_recall_at_100_max
value: 24.261739241957823
- type: nauc_recall_at_100_std
value: 32.62984573938928
- type: nauc_recall_at_10_diff1
value: 7.851256165018388
- type: nauc_recall_at_10_max
value: 27.936406600938746
- type: nauc_recall_at_10_std
value: 18.683634320636113
- type: nauc_recall_at_1_diff1
value: 22.1926640424872
- type: nauc_recall_at_1_max
value: 21.32609279586034
- type: nauc_recall_at_1_std
value: 6.566596302915438
- type: nauc_recall_at_20_diff1
value: 6.8107211705182165
- type: nauc_recall_at_20_max
value: 28.286284094687787
- type: nauc_recall_at_20_std
value: 25.932013268120862
- type: nauc_recall_at_3_diff1
value: 17.04156818427151
- type: nauc_recall_at_3_max
value: 28.645439108719216
- type: nauc_recall_at_3_std
value: 13.346047828494411
- type: nauc_recall_at_5_diff1
value: 14.906284329771822
- type: nauc_recall_at_5_max
value: 30.58628602415921
- type: nauc_recall_at_5_std
value: 15.755157478191755
- type: ndcg_at_1
value: 30.2
- type: ndcg_at_10
value: 26.638
- type: ndcg_at_100
value: 37.135
- type: ndcg_at_1000
value: 42.576
- type: ndcg_at_20
value: 30.75
- type: ndcg_at_3
value: 24.675
- type: ndcg_at_5
value: 21.836
- type: precision_at_1
value: 30.2
- type: precision_at_10
value: 14.06
- type: precision_at_100
value: 2.904
- type: precision_at_1000
value: 0.42
- type: precision_at_20
value: 9.4
- type: precision_at_3
value: 23.233
- type: precision_at_5
value: 19.439999999999998
- type: recall_at_1
value: 6.128
- type: recall_at_10
value: 28.471999999999998
- type: recall_at_100
value: 58.952000000000005
- type: recall_at_1000
value: 85.137
- type: recall_at_20
value: 38.17
- type: recall_at_3
value: 14.127999999999998
- type: recall_at_5
value: 19.673
- task:
type: STS
dataset:
name: MTEB SICK-R
type: mteb/sickr-sts
config: default
split: test
revision: 20a6d6f312dd54037fe07a32d58e5e168867909d
metrics:
- type: cosine_pearson
value: 86.86608529160739
- type: cosine_spearman
value: 82.88625166203383
- type: euclidean_pearson
value: 84.15494418856142
- type: euclidean_spearman
value: 82.88449294676421
- type: main_score
value: 82.88625166203383
- type: manhattan_pearson
value: 84.39068623474428
- type: manhattan_spearman
value: 82.88065412169463
- type: pearson
value: 86.86608529160739
- type: spearman
value: 82.88625166203383
- task:
type: STS
dataset:
name: MTEB STS12
type: mteb/sts12-sts
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cosine_pearson
value: 87.0445014940449
- type: cosine_spearman
value: 80.0880365116599
- type: euclidean_pearson
value: 83.80250772928852
- type: euclidean_spearman
value: 80.0892465260778
- type: main_score
value: 80.0880365116599
- type: manhattan_pearson
value: 83.96793981929336
- type: manhattan_spearman
value: 80.24881789268238
- type: pearson
value: 87.0445014940449
- type: spearman
value: 80.0880365116599
- task:
type: STS
dataset:
name: MTEB STS13
type: mteb/sts13-sts
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cosine_pearson
value: 89.33900828959968
- type: cosine_spearman
value: 89.68256358526733
- type: euclidean_pearson
value: 89.29188708262265
- type: euclidean_spearman
value: 89.68204344658601
- type: main_score
value: 89.68256358526733
- type: manhattan_pearson
value: 89.13996588193149
- type: manhattan_spearman
value: 89.61372804425623
- type: pearson
value: 89.33900828959968
- type: spearman
value: 89.68256358526733
- task:
type: STS
dataset:
name: MTEB STS14
type: mteb/sts14-sts
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cosine_pearson
value: 86.42029843639123
- type: cosine_spearman
value: 85.0707889220723
- type: euclidean_pearson
value: 85.75114239552562
- type: euclidean_spearman
value: 85.06858160270725
- type: main_score
value: 85.0707889220723
- type: manhattan_pearson
value: 85.86461900459038
- type: manhattan_spearman
value: 85.28671103475605
- type: pearson
value: 86.42029843639123
- type: spearman
value: 85.0707889220723
- task:
type: STS
dataset:
name: MTEB STS15
type: mteb/sts15-sts
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cosine_pearson
value: 88.3660081271444
- type: cosine_spearman
value: 89.39375083609528
- type: euclidean_pearson
value: 89.21818482894895
- type: euclidean_spearman
value: 89.39361588875443
- type: main_score
value: 89.39375083609528
- type: manhattan_pearson
value: 89.53535068014057
- type: manhattan_spearman
value: 89.81077130567752
- type: pearson
value: 88.3660081271444
- type: spearman
value: 89.39375083609528
- task:
type: STS
dataset:
name: MTEB STS16
type: mteb/sts16-sts
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cosine_pearson
value: 85.60708247171874
- type: cosine_spearman
value: 87.15234952832193
- type: euclidean_pearson
value: 86.21743555548137
- type: euclidean_spearman
value: 87.14450217418016
- type: main_score
value: 87.15234952832193
- type: manhattan_pearson
value: 86.2467748746084
- type: manhattan_spearman
value: 87.2197479717654
- type: pearson
value: 85.60708247171874
- type: spearman
value: 87.15234952832193
- task:
type: STS
dataset:
name: MTEB STS17 (en-en)
type: mteb/sts17-crosslingual-sts
config: en-en
split: test
revision: faeb762787bd10488a50c8b5be4a3b82e411949c
metrics:
- type: cosine_pearson
value: 91.25898556808458
- type: cosine_spearman
value: 91.35372390581641
- type: euclidean_pearson
value: 91.319520321348
- type: euclidean_spearman
value: 91.30821135416925
- type: main_score
value: 91.35372390581641
- type: manhattan_pearson
value: 91.14800959939069
- type: manhattan_spearman
value: 91.09775424245629
- type: pearson
value: 91.25898556808458
- type: spearman
value: 91.35372390581641
- task:
type: STS
dataset:
name: MTEB STS22 (en)
type: mteb/sts22-crosslingual-sts
config: en
split: test
revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3
metrics:
- type: cosine_pearson
value: 67.61637111515797
- type: cosine_spearman
value: 68.10379096526697
- type: euclidean_pearson
value: 69.2652309491375
- type: euclidean_spearman
value: 68.18436357033228
- type: main_score
value: 68.10379096526697
- type: manhattan_pearson
value: 69.52531340510775
- type: manhattan_spearman
value: 68.17874790391862
- type: pearson
value: 67.61637111515797
- type: spearman
value: 68.10379096526697
- task:
type: STS
dataset:
name: MTEB STSBenchmark
type: mteb/stsbenchmark-sts
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cosine_pearson
value: 87.81592853782297
- type: cosine_spearman
value: 88.2302550329183
- type: euclidean_pearson
value: 88.01165144519526
- type: euclidean_spearman
value: 88.23342148890097
- type: main_score
value: 88.2302550329183
- type: manhattan_pearson
value: 88.148592564938
- type: manhattan_spearman
value: 88.49226317320988
- type: pearson
value: 87.81592853782297
- type: spearman
value: 88.2302550329183
- task:
type: Reranking
dataset:
name: MTEB SciDocsRR
type: mteb/scidocs-reranking
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: main_score
value: 89.196009707431
- type: map
value: 89.196009707431
- type: mrr
value: 97.07198121413808
- type: nAUC_map_diff1
value: -14.066667940115352
- type: nAUC_map_max
value: 49.73702475027407
- type: nAUC_map_std
value: 64.0986775782592
- type: nAUC_mrr_diff1
value: 21.96846389417319
- type: nAUC_mrr_max
value: 86.38341077184032
- type: nAUC_mrr_std
value: 75.38945014727746
- task:
type: Retrieval
dataset:
name: MTEB SciFact
type: mteb/scifact
config: default
split: test
revision: 0228b52cf27578f30900b9e5271d331663a030d7
metrics:
- type: main_score
value: 80.08999999999999
- type: map_at_1
value: 63.161
- type: map_at_10
value: 75.163
- type: map_at_100
value: 75.408
- type: map_at_1000
value: 75.409
- type: map_at_20
value: 75.332
- type: map_at_3
value: 71.839
- type: map_at_5
value: 74.32600000000001
- type: mrr_at_1
value: 66.33333333333333
- type: mrr_at_10
value: 75.95978835978836
- type: mrr_at_100
value: 76.15647881281473
- type: mrr_at_1000
value: 76.15736533763744
- type: mrr_at_20
value: 76.08557368557368
- type: mrr_at_3
value: 73.55555555555556
- type: mrr_at_5
value: 75.4888888888889
- type: nauc_map_at_1000_diff1
value: 77.31229383811176
- type: nauc_map_at_1000_max
value: 58.848319058605156
- type: nauc_map_at_1000_std
value: -14.290090263454985
- type: nauc_map_at_100_diff1
value: 77.31325400213969
- type: nauc_map_at_100_max
value: 58.848885054155275
- type: nauc_map_at_100_std
value: -14.285806618869273
- type: nauc_map_at_10_diff1
value: 77.1806705504232
- type: nauc_map_at_10_max
value: 59.02905805134415
- type: nauc_map_at_10_std
value: -14.132954900037467
- type: nauc_map_at_1_diff1
value: 81.03932970557837
- type: nauc_map_at_1_max
value: 49.02073230264529
- type: nauc_map_at_1_std
value: -22.977452975845512
- type: nauc_map_at_20_diff1
value: 77.22581364818562
- type: nauc_map_at_20_max
value: 58.90740400399768
- type: nauc_map_at_20_std
value: -14.245079150986745
- type: nauc_map_at_3_diff1
value: 76.99793243255563
- type: nauc_map_at_3_max
value: 54.9930733886623
- type: nauc_map_at_3_std
value: -19.297708446082407
- type: nauc_map_at_5_diff1
value: 77.1671608360295
- type: nauc_map_at_5_max
value: 57.27757489519526
- type: nauc_map_at_5_std
value: -15.446338357667708
- type: nauc_mrr_at_1000_diff1
value: 77.4806080821202
- type: nauc_mrr_at_1000_max
value: 60.9213776129792
- type: nauc_mrr_at_1000_std
value: -12.139599632228343
- type: nauc_mrr_at_100_diff1
value: 77.48158073865281
- type: nauc_mrr_at_100_max
value: 60.9218657185361
- type: nauc_mrr_at_100_std
value: -12.13532070453677
- type: nauc_mrr_at_10_diff1
value: 77.32428546014407
- type: nauc_mrr_at_10_max
value: 61.018407010343466
- type: nauc_mrr_at_10_std
value: -12.143193773309347
- type: nauc_mrr_at_1_diff1
value: 80.99806778887115
- type: nauc_mrr_at_1_max
value: 59.17855969530095
- type: nauc_mrr_at_1_std
value: -12.30545640831458
- type: nauc_mrr_at_20_diff1
value: 77.3811067653992
- type: nauc_mrr_at_20_max
value: 60.9648880366335
- type: nauc_mrr_at_20_std
value: -12.124066076541853
- type: nauc_mrr_at_3_diff1
value: 77.31304316321959
- type: nauc_mrr_at_3_max
value: 60.75536766404163
- type: nauc_mrr_at_3_std
value: -12.997876030849623
- type: nauc_mrr_at_5_diff1
value: 77.12952864141742
- type: nauc_mrr_at_5_max
value: 60.995943754968685
- type: nauc_mrr_at_5_std
value: -11.353447465605694
- type: nauc_ndcg_at_1000_diff1
value: 76.81788665683746
- type: nauc_ndcg_at_1000_max
value: 60.35947755262391
- type: nauc_ndcg_at_1000_std
value: -12.884942372460362
- type: nauc_ndcg_at_100_diff1
value: 76.87388230365198
- type: nauc_ndcg_at_100_max
value: 60.38813162962434
- type: nauc_ndcg_at_100_std
value: -12.64384717800478
- type: nauc_ndcg_at_10_diff1
value: 75.87713506026317
- type: nauc_ndcg_at_10_max
value: 61.39356554675667
- type: nauc_ndcg_at_10_std
value: -12.144227584144218
- type: nauc_ndcg_at_1_diff1
value: 80.99806778887115
- type: nauc_ndcg_at_1_max
value: 59.17855969530095
- type: nauc_ndcg_at_1_std
value: -12.30545640831458
- type: nauc_ndcg_at_20_diff1
value: 76.09913944506627
- type: nauc_ndcg_at_20_max
value: 61.01644448834147
- type: nauc_ndcg_at_20_std
value: -12.456209267623857
- type: nauc_ndcg_at_3_diff1
value: 75.52717946614608
- type: nauc_ndcg_at_3_max
value: 58.96433090721983
- type: nauc_ndcg_at_3_std
value: -15.849280494339556
- type: nauc_ndcg_at_5_diff1
value: 75.69026981016921
- type: nauc_ndcg_at_5_max
value: 58.924044405851326
- type: nauc_ndcg_at_5_std
value: -13.182728827923107
- type: nauc_precision_at_1000_diff1
value: -31.634022001609914
- type: nauc_precision_at_1000_max
value: 31.46271490784504
- type: nauc_precision_at_1000_std
value: 60.44801276891442
- type: nauc_precision_at_100_diff1
value: -29.722363469948103
- type: nauc_precision_at_100_max
value: 32.05464592020074
- type: nauc_precision_at_100_std
value: 60.832570595613554
- type: nauc_precision_at_10_diff1
value: -11.91731376599939
- type: nauc_precision_at_10_max
value: 45.43646553157129
- type: nauc_precision_at_10_std
value: 52.962408871791276
- type: nauc_precision_at_1_diff1
value: 80.99806778887115
- type: nauc_precision_at_1_max
value: 59.17855969530095
- type: nauc_precision_at_1_std
value: -12.30545640831458
- type: nauc_precision_at_20_diff1
value: -18.43293701721667
- type: nauc_precision_at_20_max
value: 39.53434874203934
- type: nauc_precision_at_20_std
value: 53.6291982468461
- type: nauc_precision_at_3_diff1
value: 30.84789043003892
- type: nauc_precision_at_3_max
value: 55.660727758110376
- type: nauc_precision_at_3_std
value: 17.87243920840355
- type: nauc_precision_at_5_diff1
value: 4.099395181445625
- type: nauc_precision_at_5_max
value: 50.346770968709386
- type: nauc_precision_at_5_std
value: 44.66722483255029
- type: nauc_recall_at_1000_diff1
value: .nan
- type: nauc_recall_at_1000_max
value: .nan
- type: nauc_recall_at_1000_std
value: .nan
- type: nauc_recall_at_100_diff1
value: 100.0
- type: nauc_recall_at_100_max
value: 72.2222222222207
- type: nauc_recall_at_100_std
value: 86.92810457516407
- type: nauc_recall_at_10_diff1
value: 62.18887555022005
- type: nauc_recall_at_10_max
value: 75.14339068960916
- type: nauc_recall_at_10_std
value: -1.4912631719357108
- type: nauc_recall_at_1_diff1
value: 81.03932970557837
- type: nauc_recall_at_1_max
value: 49.02073230264529
- type: nauc_recall_at_1_std
value: -22.977452975845512
- type: nauc_recall_at_20_diff1
value: 59.27414444038499
- type: nauc_recall_at_20_max
value: 76.32241302318047
- type: nauc_recall_at_20_std
value: -0.8322169447488666
- type: nauc_recall_at_3_diff1
value: 69.58783002593157
- type: nauc_recall_at_3_max
value: 55.89660919896563
- type: nauc_recall_at_3_std
value: -21.183005510917862
- type: nauc_recall_at_5_diff1
value: 65.53660499878802
- type: nauc_recall_at_5_max
value: 58.218018535135805
- type: nauc_recall_at_5_std
value: -8.328952210032455
- type: ndcg_at_1
value: 66.333
- type: ndcg_at_10
value: 80.08999999999999
- type: ndcg_at_100
value: 81.24900000000001
- type: ndcg_at_1000
value: 81.28800000000001
- type: ndcg_at_20
value: 80.625
- type: ndcg_at_3
value: 74.98700000000001
- type: ndcg_at_5
value: 78.553
- type: precision_at_1
value: 66.333
- type: precision_at_10
value: 10.667
- type: precision_at_100
value: 1.127
- type: precision_at_1000
value: 0.11299999999999999
- type: precision_at_20
value: 5.45
- type: precision_at_3
value: 29.555999999999997
- type: precision_at_5
value: 20.133000000000003
- type: recall_at_1
value: 63.161
- type: recall_at_10
value: 94.167
- type: recall_at_100
value: 99.667
- type: recall_at_1000
value: 100.0
- type: recall_at_20
value: 96.167
- type: recall_at_3
value: 80.972
- type: recall_at_5
value: 89.90599999999999
- task:
type: PairClassification
dataset:
name: MTEB SprintDuplicateQuestions
type: mteb/sprintduplicatequestions-pairclassification
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cosine_accuracy
value: 99.81881188118813
- type: cosine_accuracy_threshold
value: 85.55081486701965
- type: cosine_ap
value: 96.0359661816236
- type: cosine_f1
value: 90.6584992343032
- type: cosine_f1_threshold
value: 84.82859134674072
- type: cosine_precision
value: 92.59645464025026
- type: cosine_recall
value: 88.8
- type: dot_accuracy
value: 99.81881188118813
- type: dot_accuracy_threshold
value: 84.91908311843872
- type: dot_ap
value: 96.05740121094365
- type: dot_f1
value: 90.81885856079404
- type: dot_f1_threshold
value: 83.84919166564941
- type: dot_precision
value: 90.14778325123153
- type: dot_recall
value: 91.5
- type: euclidean_accuracy
value: 99.82079207920792
- type: euclidean_accuracy_threshold
value: 54.49706315994263
- type: euclidean_ap
value: 96.03223527068818
- type: euclidean_f1
value: 90.72270630445925
- type: euclidean_f1_threshold
value: 54.49706315994263
- type: euclidean_precision
value: 93.05993690851734
- type: euclidean_recall
value: 88.5
- type: main_score
value: 96.32671902439806
- type: manhattan_accuracy
value: 99.83267326732673
- type: manhattan_accuracy_threshold
value: 3818.192672729492
- type: manhattan_ap
value: 96.32671902439806
- type: manhattan_f1
value: 91.52032112393378
- type: manhattan_f1_threshold
value: 3818.192672729492
- type: manhattan_precision
value: 91.8429003021148
- type: manhattan_recall
value: 91.2
- type: max_ap
value: 96.32671902439806
- type: max_f1
value: 91.52032112393378
- type: max_precision
value: 93.05993690851734
- type: max_recall
value: 91.5
- type: similarity_accuracy
value: 99.81881188118813
- type: similarity_accuracy_threshold
value: 85.55081486701965
- type: similarity_ap
value: 96.0359661816236
- type: similarity_f1
value: 90.6584992343032
- type: similarity_f1_threshold
value: 84.82859134674072
- type: similarity_precision
value: 92.59645464025026
- type: similarity_recall
value: 88.8
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClustering
type: mteb/stackexchange-clustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: main_score
value: 80.28558559137414
- type: v_measure
value: 80.28558559137414
- type: v_measure_std
value: 2.795276520287584
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClusteringP2P
type: mteb/stackexchange-clustering-p2p
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: main_score
value: 49.57135582416209
- type: v_measure
value: 49.57135582416209
- type: v_measure_std
value: 1.6414135468423754
- task:
type: Reranking
dataset:
name: MTEB StackOverflowDupQuestions
type: mteb/stackoverflowdupquestions-reranking
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: main_score
value: 55.253002583598644
- type: map
value: 55.253002583598644
- type: mrr
value: 56.24172396231219
- type: nAUC_map_diff1
value: 40.00053248203427
- type: nAUC_map_max
value: 10.05441740585869
- type: nAUC_map_std
value: 8.227169286387552
- type: nAUC_mrr_diff1
value: 40.250446264233744
- type: nAUC_mrr_max
value: 10.586310195339053
- type: nAUC_mrr_std
value: 8.47326494370076
- task:
type: Summarization
dataset:
name: MTEB SummEval
type: mteb/summeval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cosine_pearson
value: 31.19874648747059
- type: cosine_spearman
value: 31.493550648844863
- type: dot_pearson
value: 31.157847680289407
- type: dot_spearman
value: 31.575299712180538
- type: main_score
value: 31.493550648844863
- type: pearson
value: 31.19874648747059
- type: spearman
value: 31.493550648844863
- task:
type: Retrieval
dataset:
name: MTEB TRECCOVID
type: mteb/trec-covid
config: default
split: test
revision: bb9466bac8153a0349341eb1b22e06409e78ef4e
metrics:
- type: main_score
value: 85.983
- type: map_at_1
value: 0.247
- type: map_at_10
value: 2.177
- type: map_at_100
value: 14.804
- type: map_at_1000
value: 37.045
- type: map_at_20
value: 4.12
- type: map_at_3
value: 0.7000000000000001
- type: map_at_5
value: 1.1320000000000001
- type: mrr_at_1
value: 96.0
- type: mrr_at_10
value: 98.0
- type: mrr_at_100
value: 98.0
- type: mrr_at_1000
value: 98.0
- type: mrr_at_20
value: 98.0
- type: mrr_at_3
value: 98.0
- type: mrr_at_5
value: 98.0
- type: nauc_map_at_1000_diff1
value: -0.9165125200337213
- type: nauc_map_at_1000_max
value: 40.260117798042764
- type: nauc_map_at_1000_std
value: 71.72789335831554
- type: nauc_map_at_100_diff1
value: 20.493827311583953
- type: nauc_map_at_100_max
value: 21.005742079276462
- type: nauc_map_at_100_std
value: 62.53815607831659
- type: nauc_map_at_10_diff1
value: 31.289297684528215
- type: nauc_map_at_10_max
value: 7.86554294370268
- type: nauc_map_at_10_std
value: 37.26191657133897
- type: nauc_map_at_1_diff1
value: 25.57568148849456
- type: nauc_map_at_1_max
value: -5.9767435623941445
- type: nauc_map_at_1_std
value: 30.849871717506755
- type: nauc_map_at_20_diff1
value: 30.896018204532087
- type: nauc_map_at_20_max
value: 8.667077299744314
- type: nauc_map_at_20_std
value: 41.512687168412924
- type: nauc_map_at_3_diff1
value: 29.44724521006598
- type: nauc_map_at_3_max
value: 1.597496889532064
- type: nauc_map_at_3_std
value: 32.25013773854697
- type: nauc_map_at_5_diff1
value: 27.387036605618825
- type: nauc_map_at_5_max
value: 5.402983746211454
- type: nauc_map_at_5_std
value: 33.940523962472184
- type: nauc_mrr_at_1000_diff1
value: -14.122315592903503
- type: nauc_mrr_at_1000_max
value: 33.84687208216605
- type: nauc_mrr_at_1000_std
value: 86.11111111111092
- type: nauc_mrr_at_100_diff1
value: -14.122315592903503
- type: nauc_mrr_at_100_max
value: 33.84687208216605
- type: nauc_mrr_at_100_std
value: 86.11111111111092
- type: nauc_mrr_at_10_diff1
value: -14.122315592903503
- type: nauc_mrr_at_10_max
value: 33.84687208216605
- type: nauc_mrr_at_10_std
value: 86.11111111111092
- type: nauc_mrr_at_1_diff1
value: -14.122315592903831
- type: nauc_mrr_at_1_max
value: 33.84687208216637
- type: nauc_mrr_at_1_std
value: 86.11111111111124
- type: nauc_mrr_at_20_diff1
value: -14.122315592903503
- type: nauc_mrr_at_20_max
value: 33.84687208216605
- type: nauc_mrr_at_20_std
value: 86.11111111111092
- type: nauc_mrr_at_3_diff1
value: -14.122315592903503
- type: nauc_mrr_at_3_max
value: 33.84687208216605
- type: nauc_mrr_at_3_std
value: 86.11111111111092
- type: nauc_mrr_at_5_diff1
value: -14.122315592903503
- type: nauc_mrr_at_5_max
value: 33.84687208216605
- type: nauc_mrr_at_5_std
value: 86.11111111111092
- type: nauc_ndcg_at_1000_diff1
value: 8.745907669561928
- type: nauc_ndcg_at_1000_max
value: 45.43307237994533
- type: nauc_ndcg_at_1000_std
value: 74.93357447176336
- type: nauc_ndcg_at_100_diff1
value: -3.9719350773353765
- type: nauc_ndcg_at_100_max
value: 44.43705332397461
- type: nauc_ndcg_at_100_std
value: 61.59493812371758
- type: nauc_ndcg_at_10_diff1
value: 15.230915878367348
- type: nauc_ndcg_at_10_max
value: 48.332840970836635
- type: nauc_ndcg_at_10_std
value: 46.888785065125774
- type: nauc_ndcg_at_1_diff1
value: 13.219732337379442
- type: nauc_ndcg_at_1_max
value: 45.19919078742603
- type: nauc_ndcg_at_1_std
value: 64.68253968253977
- type: nauc_ndcg_at_20_diff1
value: 12.479648691964865
- type: nauc_ndcg_at_20_max
value: 48.76688248450331
- type: nauc_ndcg_at_20_std
value: 51.450399755887545
- type: nauc_ndcg_at_3_diff1
value: 6.165414201871464
- type: nauc_ndcg_at_3_max
value: 45.089689347691035
- type: nauc_ndcg_at_3_std
value: 41.08249161845213
- type: nauc_ndcg_at_5_diff1
value: 7.411245806844721
- type: nauc_ndcg_at_5_max
value: 47.818748093538076
- type: nauc_ndcg_at_5_std
value: 45.907685763676575
- type: nauc_precision_at_1000_diff1
value: -30.574290219847345
- type: nauc_precision_at_1000_max
value: 32.56926126118719
- type: nauc_precision_at_1000_std
value: 14.584504392628874
- type: nauc_precision_at_100_diff1
value: -10.199740234718847
- type: nauc_precision_at_100_max
value: 41.0213226769777
- type: nauc_precision_at_100_std
value: 56.975760776771324
- type: nauc_precision_at_10_diff1
value: 7.865792689701161
- type: nauc_precision_at_10_max
value: 52.00432275201737
- type: nauc_precision_at_10_std
value: 43.89512276413724
- type: nauc_precision_at_1_diff1
value: -14.122315592903831
- type: nauc_precision_at_1_max
value: 33.84687208216637
- type: nauc_precision_at_1_std
value: 86.11111111111124
- type: nauc_precision_at_20_diff1
value: 5.481424191880084
- type: nauc_precision_at_20_max
value: 46.86629331792725
- type: nauc_precision_at_20_std
value: 49.245692667517496
- type: nauc_precision_at_3_diff1
value: -5.870408807869163
- type: nauc_precision_at_3_max
value: 48.73657612128875
- type: nauc_precision_at_3_std
value: 41.15152062088262
- type: nauc_precision_at_5_diff1
value: -4.550610529125413
- type: nauc_precision_at_5_max
value: 60.390115878205386
- type: nauc_precision_at_5_std
value: 44.16494295055696
- type: nauc_recall_at_1000_diff1
value: 8.047794367079034
- type: nauc_recall_at_1000_max
value: 37.07551482870489
- type: nauc_recall_at_1000_std
value: 66.20862163364201
- type: nauc_recall_at_100_diff1
value: 25.08104923597475
- type: nauc_recall_at_100_max
value: 9.971294642165734
- type: nauc_recall_at_100_std
value: 51.737814074891254
- type: nauc_recall_at_10_diff1
value: 32.33148478369628
- type: nauc_recall_at_10_max
value: 1.3767192150014917
- type: nauc_recall_at_10_std
value: 30.801926742876308
- type: nauc_recall_at_1_diff1
value: 25.57568148849456
- type: nauc_recall_at_1_max
value: -5.9767435623941445
- type: nauc_recall_at_1_std
value: 30.849871717506755
- type: nauc_recall_at_20_diff1
value: 31.716580022934654
- type: nauc_recall_at_20_max
value: -0.1281270579464631
- type: nauc_recall_at_20_std
value: 33.76185294993676
- type: nauc_recall_at_3_diff1
value: 29.758810004388348
- type: nauc_recall_at_3_max
value: -1.9442985017191816
- type: nauc_recall_at_3_std
value: 27.45550076962206
- type: nauc_recall_at_5_diff1
value: 27.047710181576672
- type: nauc_recall_at_5_max
value: 1.5237000700880248
- type: nauc_recall_at_5_std
value: 28.235297950159698
- type: ndcg_at_1
value: 94.0
- type: ndcg_at_10
value: 85.983
- type: ndcg_at_100
value: 69.195
- type: ndcg_at_1000
value: 62.541000000000004
- type: ndcg_at_20
value: 83.405
- type: ndcg_at_3
value: 89.98899999999999
- type: ndcg_at_5
value: 87.905
- type: precision_at_1
value: 96.0
- type: precision_at_10
value: 89.4
- type: precision_at_100
value: 71.54
- type: precision_at_1000
value: 27.594
- type: precision_at_20
value: 87.2
- type: precision_at_3
value: 92.667
- type: precision_at_5
value: 90.8
- type: recall_at_1
value: 0.247
- type: recall_at_10
value: 2.315
- type: recall_at_100
value: 17.574
- type: recall_at_1000
value: 59.336999999999996
- type: recall_at_20
value: 4.491
- type: recall_at_3
value: 0.7250000000000001
- type: recall_at_5
value: 1.1820000000000002
- task:
type: Retrieval
dataset:
name: MTEB Touche2020
type: mteb/touche2020
config: default
split: test
revision: a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f
metrics:
- type: main_score
value: 29.944
- type: map_at_1
value: 3.064
- type: map_at_10
value: 11.501999999999999
- type: map_at_100
value: 18.736
- type: map_at_1000
value: 20.333000000000002
- type: map_at_20
value: 14.057
- type: map_at_3
value: 6.300999999999999
- type: map_at_5
value: 8.463
- type: mrr_at_1
value: 44.89795918367347
- type: mrr_at_10
value: 58.41188856494979
- type: mrr_at_100
value: 58.93964266413245
- type: mrr_at_1000
value: 58.93964266413245
- type: mrr_at_20
value: 58.767485349118
- type: mrr_at_3
value: 54.42176870748299
- type: mrr_at_5
value: 56.666666666666664
- type: nauc_map_at_1000_diff1
value: 11.478593385608479
- type: nauc_map_at_1000_max
value: 10.309889845044324
- type: nauc_map_at_1000_std
value: 21.16721939940238
- type: nauc_map_at_100_diff1
value: 11.570438543562418
- type: nauc_map_at_100_max
value: 8.426183648064834
- type: nauc_map_at_100_std
value: 18.56231985033613
- type: nauc_map_at_10_diff1
value: 22.37735506247481
- type: nauc_map_at_10_max
value: 5.455946239060806
- type: nauc_map_at_10_std
value: -4.2848826518388154
- type: nauc_map_at_1_diff1
value: 27.853645380676824
- type: nauc_map_at_1_max
value: 7.30739948053113
- type: nauc_map_at_1_std
value: -0.2773663157814586
- type: nauc_map_at_20_diff1
value: 14.724669779924648
- type: nauc_map_at_20_max
value: 10.12882779173533
- type: nauc_map_at_20_std
value: 4.4803777672120875
- type: nauc_map_at_3_diff1
value: 31.891173385921263
- type: nauc_map_at_3_max
value: 4.889652271827218
- type: nauc_map_at_3_std
value: -9.477460238651643
- type: nauc_map_at_5_diff1
value: 31.489012040465003
- type: nauc_map_at_5_max
value: 1.7330092417337482
- type: nauc_map_at_5_std
value: -8.137018608469637
- type: nauc_mrr_at_1000_diff1
value: 24.411522237082416
- type: nauc_mrr_at_1000_max
value: 11.286971076556688
- type: nauc_mrr_at_1000_std
value: 23.443174210894043
- type: nauc_mrr_at_100_diff1
value: 24.411522237082416
- type: nauc_mrr_at_100_max
value: 11.286971076556688
- type: nauc_mrr_at_100_std
value: 23.443174210894043
- type: nauc_mrr_at_10_diff1
value: 23.948152308265186
- type: nauc_mrr_at_10_max
value: 12.22420979621155
- type: nauc_mrr_at_10_std
value: 23.557939024705544
- type: nauc_mrr_at_1_diff1
value: 17.902334894536107
- type: nauc_mrr_at_1_max
value: 17.36969662861018
- type: nauc_mrr_at_1_std
value: 19.425714969048734
- type: nauc_mrr_at_20_diff1
value: 24.635893795899797
- type: nauc_mrr_at_20_max
value: 11.330541067194913
- type: nauc_mrr_at_20_std
value: 23.74518583400233
- type: nauc_mrr_at_3_diff1
value: 25.045536328282587
- type: nauc_mrr_at_3_max
value: 7.497967004732733
- type: nauc_mrr_at_3_std
value: 24.167153007320078
- type: nauc_mrr_at_5_diff1
value: 24.328479930592454
- type: nauc_mrr_at_5_max
value: 10.037126854938336
- type: nauc_mrr_at_5_std
value: 25.236208055346136
- type: nauc_ndcg_at_1000_diff1
value: 15.555347444667389
- type: nauc_ndcg_at_1000_max
value: 13.356591700655718
- type: nauc_ndcg_at_1000_std
value: 42.42395845935052
- type: nauc_ndcg_at_100_diff1
value: 13.110526060413708
- type: nauc_ndcg_at_100_max
value: 3.140006440162515
- type: nauc_ndcg_at_100_std
value: 39.02733288398033
- type: nauc_ndcg_at_10_diff1
value: 20.68853369009725
- type: nauc_ndcg_at_10_max
value: 2.435389817058852
- type: nauc_ndcg_at_10_std
value: 10.038202768784316
- type: nauc_ndcg_at_1_diff1
value: 20.17287594582385
- type: nauc_ndcg_at_1_max
value: 12.487205168273196
- type: nauc_ndcg_at_1_std
value: 20.639827614373075
- type: nauc_ndcg_at_20_diff1
value: 16.987577348502985
- type: nauc_ndcg_at_20_max
value: 2.9978717644469266
- type: nauc_ndcg_at_20_std
value: 13.015690866750354
- type: nauc_ndcg_at_3_diff1
value: 32.392223079245575
- type: nauc_ndcg_at_3_max
value: 1.587587110582544
- type: nauc_ndcg_at_3_std
value: 12.850592473446609
- type: nauc_ndcg_at_5_diff1
value: 32.80244517369626
- type: nauc_ndcg_at_5_max
value: 5.8939933777508084
- type: nauc_ndcg_at_5_std
value: 15.779687411463414
- type: nauc_precision_at_1000_diff1
value: -14.314031720452537
- type: nauc_precision_at_1000_max
value: 32.87886666567266
- type: nauc_precision_at_1000_std
value: 21.49347046886851
- type: nauc_precision_at_100_diff1
value: -9.4034008613839
- type: nauc_precision_at_100_max
value: 16.784075123309645
- type: nauc_precision_at_100_std
value: 73.14688535393604
- type: nauc_precision_at_10_diff1
value: 6.855101404043058
- type: nauc_precision_at_10_max
value: 6.52491228645612
- type: nauc_precision_at_10_std
value: 16.104602266016744
- type: nauc_precision_at_1_diff1
value: 17.902334894536107
- type: nauc_precision_at_1_max
value: 17.36969662861018
- type: nauc_precision_at_1_std
value: 19.425714969048734
- type: nauc_precision_at_20_diff1
value: -5.337534613602212
- type: nauc_precision_at_20_max
value: 17.722925454767218
- type: nauc_precision_at_20_std
value: 34.26680462132849
- type: nauc_precision_at_3_diff1
value: 31.054623397809255
- type: nauc_precision_at_3_max
value: -0.92038600946826
- type: nauc_precision_at_3_std
value: 8.326997076862916
- type: nauc_precision_at_5_diff1
value: 29.784942296920462
- type: nauc_precision_at_5_max
value: 6.337469263434779
- type: nauc_precision_at_5_std
value: 12.789597196020974
- type: nauc_recall_at_1000_diff1
value: -3.8177981862041364
- type: nauc_recall_at_1000_max
value: 14.206064332229163
- type: nauc_recall_at_1000_std
value: 74.18853420771269
- type: nauc_recall_at_100_diff1
value: 0.7677996771461106
- type: nauc_recall_at_100_max
value: -4.139924106878441
- type: nauc_recall_at_100_std
value: 48.319930706362896
- type: nauc_recall_at_10_diff1
value: 12.038835537494322
- type: nauc_recall_at_10_max
value: -2.0498983557854418
- type: nauc_recall_at_10_std
value: -2.0339180690854493
- type: nauc_recall_at_1_diff1
value: 27.853645380676824
- type: nauc_recall_at_1_max
value: 7.30739948053113
- type: nauc_recall_at_1_std
value: -0.2773663157814586
- type: nauc_recall_at_20_diff1
value: 0.7907893667756708
- type: nauc_recall_at_20_max
value: 0.8795499810558195
- type: nauc_recall_at_20_std
value: 11.512483291688282
- type: nauc_recall_at_3_diff1
value: 33.19440392639576
- type: nauc_recall_at_3_max
value: -1.5494237697432613
- type: nauc_recall_at_3_std
value: -8.560408808376984
- type: nauc_recall_at_5_diff1
value: 27.42193873870941
- type: nauc_recall_at_5_max
value: -4.74350293281128
- type: nauc_recall_at_5_std
value: -7.618060131179654
- type: ndcg_at_1
value: 42.857
- type: ndcg_at_10
value: 29.944
- type: ndcg_at_100
value: 42.624
- type: ndcg_at_1000
value: 53.384
- type: ndcg_at_20
value: 30.135
- type: ndcg_at_3
value: 34.847
- type: ndcg_at_5
value: 32.573
- type: precision_at_1
value: 44.897999999999996
- type: precision_at_10
value: 25.306
- type: precision_at_100
value: 8.694
- type: precision_at_1000
value: 1.616
- type: precision_at_20
value: 19.082
- type: precision_at_3
value: 34.014
- type: precision_at_5
value: 31.019999999999996
- type: recall_at_1
value: 3.064
- type: recall_at_10
value: 17.849999999999998
- type: recall_at_100
value: 53.217999999999996
- type: recall_at_1000
value: 87.095
- type: recall_at_20
value: 26.111
- type: recall_at_3
value: 7.383000000000001
- type: recall_at_5
value: 11.434
- task:
type: Classification
dataset:
name: MTEB ToxicConversationsClassification
type: mteb/toxic_conversations_50k
config: default
split: test
revision: edfaf9da55d3dd50d43143d90c1ac476895ae6de
metrics:
- type: accuracy
value: 88.759765625
- type: ap
value: 36.49152357863017
- type: ap_weighted
value: 36.49152357863017
- type: f1
value: 74.4692714448641
- type: f1_weighted
value: 90.54372649306606
- type: main_score
value: 88.759765625
- task:
type: Classification
dataset:
name: MTEB TweetSentimentExtractionClassification
type: mteb/tweet_sentiment_extraction
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 74.8443689869836
- type: f1
value: 75.1139662898148
- type: f1_weighted
value: 74.7369003946243
- type: main_score
value: 74.8443689869836
- task:
type: Clustering
dataset:
name: MTEB TwentyNewsgroupsClustering
type: mteb/twentynewsgroups-clustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: main_score
value: 61.42918790942448
- type: v_measure
value: 61.42918790942448
- type: v_measure_std
value: 1.0156550098843082
- task:
type: PairClassification
dataset:
name: MTEB TwitterSemEval2015
type: mteb/twittersemeval2015-pairclassification
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cosine_accuracy
value: 88.22197055492639
- type: cosine_accuracy_threshold
value: 83.30042362213135
- type: cosine_ap
value: 80.57754959194938
- type: cosine_f1
value: 73.70579190158894
- type: cosine_f1_threshold
value: 81.04978799819946
- type: cosine_precision
value: 71.64922770303936
- type: cosine_recall
value: 75.8839050131926
- type: dot_accuracy
value: 88.23985217857782
- type: dot_accuracy_threshold
value: 83.31039547920227
- type: dot_ap
value: 80.57533213448181
- type: dot_f1
value: 73.61309601143302
- type: dot_f1_threshold
value: 81.33968114852905
- type: dot_precision
value: 72.51087791144101
- type: dot_recall
value: 74.74934036939314
- type: euclidean_accuracy
value: 88.22197055492639
- type: euclidean_accuracy_threshold
value: 58.290231227874756
- type: euclidean_ap
value: 80.57982723880139
- type: euclidean_f1
value: 73.63426519620417
- type: euclidean_f1_threshold
value: 61.55576705932617
- type: euclidean_precision
value: 71.63173652694611
- type: euclidean_recall
value: 75.75197889182058
- type: main_score
value: 80.57982723880139
- type: manhattan_accuracy
value: 88.14448351910353
- type: manhattan_accuracy_threshold
value: 3907.2471618652344
- type: manhattan_ap
value: 80.3538079655539
- type: manhattan_f1
value: 73.40466675261054
- type: manhattan_f1_threshold
value: 4103.794097900391
- type: manhattan_precision
value: 71.76707839677337
- type: manhattan_recall
value: 75.11873350923483
- type: max_ap
value: 80.57982723880139
- type: max_f1
value: 73.70579190158894
- type: max_precision
value: 72.51087791144101
- type: max_recall
value: 75.8839050131926
- type: similarity_accuracy
value: 88.22197055492639
- type: similarity_accuracy_threshold
value: 83.30042362213135
- type: similarity_ap
value: 80.57754959194938
- type: similarity_f1
value: 73.70579190158894
- type: similarity_f1_threshold
value: 81.04978799819946
- type: similarity_precision
value: 71.64922770303936
- type: similarity_recall
value: 75.8839050131926
- task:
type: PairClassification
dataset:
name: MTEB TwitterURLCorpus
type: mteb/twitterurlcorpus-pairclassification
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cosine_accuracy
value: 89.88628866379477
- type: cosine_accuracy_threshold
value: 80.8050274848938
- type: cosine_ap
value: 87.57594591596816
- type: cosine_f1
value: 80.0812257707218
- type: cosine_f1_threshold
value: 77.990061044693
- type: cosine_precision
value: 76.93126197063205
- type: cosine_recall
value: 83.50015398829689
- type: dot_accuracy
value: 89.87852679784221
- type: dot_accuracy_threshold
value: 80.84419965744019
- type: dot_ap
value: 87.56136742222151
- type: dot_f1
value: 80.05898617511521
- type: dot_f1_threshold
value: 77.92385816574097
- type: dot_precision
value: 76.80554573106035
- type: dot_recall
value: 83.60024638127503
- type: euclidean_accuracy
value: 89.86882446540149
- type: euclidean_accuracy_threshold
value: 62.08193898200989
- type: euclidean_ap
value: 87.57517549192228
- type: euclidean_f1
value: 80.05286925872892
- type: euclidean_f1_threshold
value: 66.65036082267761
- type: euclidean_precision
value: 76.51063232507545
- type: euclidean_recall
value: 83.93902063443178
- type: main_score
value: 87.64162614197194
- type: manhattan_accuracy
value: 89.8959909962355
- type: manhattan_accuracy_threshold
value: 4176.108169555664
- type: manhattan_ap
value: 87.64162614197194
- type: manhattan_f1
value: 80.17116279069768
- type: manhattan_f1_threshold
value: 4433.153533935547
- type: manhattan_precision
value: 77.57615035644848
- type: manhattan_recall
value: 82.94579611949491
- type: max_ap
value: 87.64162614197194
- type: max_f1
value: 80.17116279069768
- type: max_precision
value: 77.57615035644848
- type: max_recall
value: 83.93902063443178
- type: similarity_accuracy
value: 89.88628866379477
- type: similarity_accuracy_threshold
value: 80.8050274848938
- type: similarity_ap
value: 87.57594591596816
- type: similarity_f1
value: 80.0812257707218
- type: similarity_f1_threshold
value: 77.990061044693
- type: similarity_precision
value: 76.93126197063205
- type: similarity_recall
value: 83.50015398829689
---
# Updates
New open-source models and the to-do list will be listed at https://github.com/DunZhang/Stella/blob/main/news_and_todo.md.
You can also find these models on my [homepage](https://huggingface.co/infgrad).
# Introduction
The models are trained on top of `Alibaba-NLP/gte-large-en-v1.5` and `Alibaba-NLP/gte-Qwen2-1.5B-instruct`. Thanks for
their contributions!
**We simplify prompt usage by providing two prompts that cover most general tasks: one for s2p (sentence-to-passage) and one for s2s (sentence-to-sentence).**
Prompt for the s2p task (e.g., retrieval):
```text
Instruct: Given a web search query, retrieve relevant passages that answer the query.\nQuery: {query}
```
Prompt for the s2s task (e.g., semantic textual similarity):
```text
Instruct: Retrieve semantically similar text.\nQuery: {query}
```
The models are finally trained with [MRL](https://arxiv.org/abs/2205.13147) (Matryoshka Representation Learning), so they support multiple embedding dimensions: 512, 768,
1024, 2048, 4096, 6144, and 8192.
The higher the dimension, the better the performance.
**Generally speaking, 1024d is good enough.** The MTEB score of 1024d is only 0.001 lower than that of 8192d.
# Model directory structure
The model directory is a standard SentenceTransformer directory **with a series of `2_Dense_{dims}`
folders**, where `dims` is the final vector dimension.
For example, the `2_Dense_256` folder stores the Linear weights that project vectors down to 256 dimensions.
A sketch of switching dimensions is shown below; please refer to the following sections for specific usage instructions.
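As a minimal sketch (assuming the standard SentenceTransformer `modules.json` layout, where each module entry carries an `idx`, `name`, `path`, and `type` field), switching a cloned model to another dimension amounts to pointing the dense module at a different `2_Dense_{dims}` folder:
```python
import json
from pathlib import Path

# Hypothetical local path of your cloned model.
model_dir = Path("./stella_en_1.5B_v5")
modules_path = model_dir / "modules.json"

modules = json.loads(modules_path.read_text())
for module in modules:
    # By default the dense projection points at "2_Dense_1024";
    # replace it with the folder for the dimension you want.
    if module.get("path", "").startswith("2_Dense"):
        module["path"] = "2_Dense_256"
modules_path.write_text(json.dumps(modules, indent=2))
```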
# Usage
You can use the `sentence-transformers` or `transformers` library to encode text.
## Sentence Transformers
```python
from sentence_transformers import SentenceTransformer
# This model supports two prompts: "s2p_query" and "s2s_query" for sentence-to-passage and sentence-to-sentence tasks, respectively.
# They are defined in `config_sentence_transformers.json`
query_prompt_name = "s2p_query"
queries = [
"What are some ways to reduce stress?",
"What are the benefits of drinking green tea?",
]
# docs do not need any prompts
docs = [
"There are many effective ways to reduce stress. Some common techniques include deep breathing, meditation, and physical activity. Engaging in hobbies, spending time in nature, and connecting with loved ones can also help alleviate stress. Additionally, setting boundaries, practicing self-care, and learning to say no can prevent stress from building up.",
"Green tea has been consumed for centuries and is known for its potential health benefits. It contains antioxidants that may help protect the body against damage caused by free radicals. Regular consumption of green tea has been associated with improved heart health, enhanced cognitive function, and a reduced risk of certain types of cancer. The polyphenols in green tea may also have anti-inflammatory and weight loss properties.",
]
# The default dimension is 1024. If you need another dimension, clone the model and edit `modules.json`,
# replacing `2_Dense_1024` with e.g. `2_Dense_256` or `2_Dense_8192`.
model = SentenceTransformer("dunzhang/stella_en_1.5B_v5", trust_remote_code=True).cuda()
query_embeddings = model.encode(queries, prompt_name=query_prompt_name)
doc_embeddings = model.encode(docs)
print(query_embeddings.shape, doc_embeddings.shape)
# (2, 1024) (2, 1024)
similarities = model.similarity(query_embeddings, doc_embeddings)
print(similarities)
# tensor([[0.8179, 0.2958],
# [0.3194, 0.7854]])
```
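For symmetric tasks such as semantic textual similarity, both sides can be encoded with the s2s prompt instead. A brief sketch, reusing the `model` loaded above and the `s2s_query` prompt name defined in `config_sentence_transformers.json`:
```python
# Encode two sentences with the s2s prompt for a symmetric comparison.
sts_embeddings = model.encode(
    ["The weather is nice today.", "It is sunny outside."],
    prompt_name="s2s_query",
)
print(model.similarity(sts_embeddings[0:1], sts_embeddings[1:2]))
```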
## Transformers
```python
import os
import torch
from transformers import AutoModel, AutoTokenizer
from sklearn.preprocessing import normalize
query_prompt = "Instruct: Given a web search query, retrieve relevant passages that answer the query.\nQuery: "
queries = [
"What are some ways to reduce stress?",
"What are the benefits of drinking green tea?",
]
queries = [query_prompt + query for query in queries]
# docs do not need any prompts
docs = [
"There are many effective ways to reduce stress. Some common techniques include deep breathing, meditation, and physical activity. Engaging in hobbies, spending time in nature, and connecting with loved ones can also help alleviate stress. Additionally, setting boundaries, practicing self-care, and learning to say no can prevent stress from building up.",
"Green tea has been consumed for centuries and is known for its potential health benefits. It contains antioxidants that may help protect the body against damage caused by free radicals. Regular consumption of green tea has been associated with improved heart health, enhanced cognitive function, and a reduced risk of certain types of cancer. The polyphenols in green tea may also have anti-inflammatory and weight loss properties.",
]
# The path of your model after cloning it
model_dir = "{Your MODEL_PATH}"
vector_dim = 1024
vector_linear_directory = f"2_Dense_{vector_dim}"
model = AutoModel.from_pretrained(model_dir, trust_remote_code=True).cuda().eval()
tokenizer = AutoTokenizer.from_pretrained(model_dir, trust_remote_code=True)
vector_linear = torch.nn.Linear(in_features=model.config.hidden_size, out_features=vector_dim)
vector_linear_dict = {
k.replace("linear.", ""): v for k, v in
torch.load(os.path.join(model_dir, f"{vector_linear_directory}/pytorch_model.bin")).items()
}
vector_linear.load_state_dict(vector_linear_dict)
vector_linear.cuda()
# Embed the queries
with torch.no_grad():
input_data = tokenizer(queries, padding="longest", truncation=True, max_length=512, return_tensors="pt")
input_data = {k: v.cuda() for k, v in input_data.items()}
attention_mask = input_data["attention_mask"]
last_hidden_state = model(**input_data)[0]
last_hidden = last_hidden_state.masked_fill(~attention_mask[..., None].bool(), 0.0)
query_vectors = last_hidden.sum(dim=1) / attention_mask.sum(dim=1)[..., None]
query_vectors = normalize(vector_linear(query_vectors).cpu().numpy())
# Embed the documents
with torch.no_grad():
input_data = tokenizer(docs, padding="longest", truncation=True, max_length=512, return_tensors="pt")
input_data = {k: v.cuda() for k, v in input_data.items()}
attention_mask = input_data["attention_mask"]
last_hidden_state = model(**input_data)[0]
last_hidden = last_hidden_state.masked_fill(~attention_mask[..., None].bool(), 0.0)
docs_vectors = last_hidden.sum(dim=1) / attention_mask.sum(dim=1)[..., None]
docs_vectors = normalize(vector_linear(docs_vectors).cpu().numpy())
print(query_vectors.shape, docs_vectors.shape)
# (2, 1024) (2, 1024)
similarities = query_vectors @ docs_vectors.T
print(similarities)
# [[0.8178789 0.2958377 ]
# [0.31938642 0.7853526 ]]
```
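The query and document passes above share the same mean-pooling logic. As a small refactoring sketch (reusing the `model`, `tokenizer`, and `vector_linear` objects and the imports defined above), the shared steps can live in one helper:
```python
def embed(texts, model, tokenizer, vector_linear, max_length=512):
    # Tokenize, mean-pool over non-padding tokens, project, and normalize.
    with torch.no_grad():
        inputs = tokenizer(texts, padding="longest", truncation=True,
                           max_length=max_length, return_tensors="pt")
        inputs = {k: v.cuda() for k, v in inputs.items()}
        mask = inputs["attention_mask"]
        hidden = model(**inputs)[0]
        hidden = hidden.masked_fill(~mask[..., None].bool(), 0.0)
        pooled = hidden.sum(dim=1) / mask.sum(dim=1)[..., None]
    return normalize(vector_linear(pooled).cpu().numpy())

# query_vectors = embed(queries, model, tokenizer, vector_linear)
# docs_vectors = embed(docs, model, tokenizer, vector_linear)
```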
# FAQ
Q: What are the details of training?
A: The training method and datasets will be released in the future (specific time unknown; they may be provided in a paper).

Q: How do I choose a suitable prompt for my own task?
A: In most cases, please use the s2p and s2s prompts. These two prompts account for the vast majority of the training
data.

Q: How do I reproduce the MTEB results?
A: Please use the evaluation scripts from `Alibaba-NLP/gte-Qwen2-1.5B-instruct` or `intfloat/e5-mistral-7b-instruct`. A minimal sketch with the `mteb` package is shown below.
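As a rough sketch of the evaluation flow (assuming the `mteb` package is installed; the exact task list and prompt handling should follow the referenced scripts):
```python
import mteb
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("dunzhang/stella_en_1.5B_v5", trust_remote_code=True)
tasks = mteb.get_tasks(tasks=["STS12"])  # pick the MTEB tasks to reproduce
evaluation = mteb.MTEB(tasks=tasks)
results = evaluation.run(model, output_folder="results")
```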
Q: Why does each dimension have its own linear weight?
A: MRL has multiple training methods; we chose this one because it has the best performance.

Q: What is the sequence length of the models?
A: 512 is recommended. In our experiments, almost all models perform poorly on specialized long-text retrieval datasets; besides, the
model is trained on datasets with a length of 512. This may be a direction for future optimization.
If you have any questions, please start a discussion in the community tab.
"SUMMARIZATION"
]
| [
"BIOSSES",
"SCIFACT"
]
| Non_BioNLP |
# Updates
New open-source models and ToDoList will be listed on https://github.com/DunZhang/Stella/blob/main/news_and_todo.md.
You can also find these models on my [homepage](https://huggingface.co/infgrad).
# Introduction
The models are trained based on `Alibaba-NLP/gte-large-en-v1.5` and `Alibaba-NLP/gte-Qwen2-1.5B-instruct`. Thanks for
their contributions!
**We simplify usage of prompts, providing two prompts for most general tasks, one is for s2p, another one is for s2s.**
Prompt of s2p task(e.g. retrieve task):
```text
Instruct: Given a web search query, retrieve relevant passages that answer the query.\nQuery: {query}
```
Prompt of s2s task(e.g. semantic textual similarity task):
```text
Instruct: Retrieve semantically similar text.\nQuery: {query}
```
The models are finally trained by [MRL]((https://arxiv.org/abs/2205.13147)), so they have multiple dimensions: 512, 768,
1024, 2048, 4096, 6144 and 8192.
The higher the dimension, the better the performance.
**Generally speaking, 1024d is good enough.** The MTEB score of 1024d is only 0.001 lower than 8192d.
# Model directory structure
The model directory structure is very simple, it is a standard SentenceTransformer directory **with a series
of `2_Dense_{dims}`
folders**, where `dims` represents the final vector dimension.
For example, the `2_Dense_256` folder stores Linear weights that convert vector dimensions to 256 dimensions.
Please refer to the following chapters for specific instructions on how to use them.
# Usage
You can use `SentenceTransformers` or `transformers` library to encode text.
## Sentence Transformers
```python
from sentence_transformers import SentenceTransformer
# This model supports two prompts: "s2p_query" and "s2s_query" for sentence-to-passage and sentence-to-sentence tasks, respectively.
# They are defined in `config_sentence_transformers.json`
query_prompt_name = "s2p_query"
queries = [
"What are some ways to reduce stress?",
"What are the benefits of drinking green tea?",
]
# docs do not need any prompts
docs = [
"There are many effective ways to reduce stress. Some common techniques include deep breathing, meditation, and physical activity. Engaging in hobbies, spending time in nature, and connecting with loved ones can also help alleviate stress. Additionally, setting boundaries, practicing self-care, and learning to say no can prevent stress from building up.",
"Green tea has been consumed for centuries and is known for its potential health benefits. It contains antioxidants that may help protect the body against damage caused by free radicals. Regular consumption of green tea has been associated with improved heart health, enhanced cognitive function, and a reduced risk of certain types of cancer. The polyphenols in green tea may also have anti-inflammatory and weight loss properties.",
]
# !The default dimension is 1024, if you need other dimensions, please clone the model and modify `modules.json` to replace `2_Dense_1024` with another dimension, e.g. `2_Dense_256` or `2_Dense_8192` !
model = SentenceTransformer("dunzhang/stella_en_1.5B_v5", trust_remote_code=True).cuda()
query_embeddings = model.encode(queries, prompt_name=query_prompt_name)
doc_embeddings = model.encode(docs)
print(query_embeddings.shape, doc_embeddings.shape)
# (2, 1024) (2, 1024)
similarities = model.similarity(query_embeddings, doc_embeddings)
print(similarities)
# tensor([[0.8179, 0.2958],
# [0.3194, 0.7854]])
```
## Transformers
```python
import os
import torch
from transformers import AutoModel, AutoTokenizer
from sklearn.preprocessing import normalize
query_prompt = "Instruct: Given a web search query, retrieve relevant passages that answer the query.\nQuery: "
queries = [
"What are some ways to reduce stress?",
"What are the benefits of drinking green tea?",
]
queries = [query_prompt + query for query in queries]
# docs do not need any prompts
docs = [
"There are many effective ways to reduce stress. Some common techniques include deep breathing, meditation, and physical activity. Engaging in hobbies, spending time in nature, and connecting with loved ones can also help alleviate stress. Additionally, setting boundaries, practicing self-care, and learning to say no can prevent stress from building up.",
"Green tea has been consumed for centuries and is known for its potential health benefits. It contains antioxidants that may help protect the body against damage caused by free radicals. Regular consumption of green tea has been associated with improved heart health, enhanced cognitive function, and a reduced risk of certain types of cancer. The polyphenols in green tea may also have anti-inflammatory and weight loss properties.",
]
# The path of your model after cloning it
model_dir = "{Your MODEL_PATH}"
vector_dim = 1024
vector_linear_directory = f"2_Dense_{vector_dim}"
model = AutoModel.from_pretrained(model_dir, trust_remote_code=True).cuda().eval()
tokenizer = AutoTokenizer.from_pretrained(model_dir, trust_remote_code=True)
vector_linear = torch.nn.Linear(in_features=model.config.hidden_size, out_features=vector_dim)
vector_linear_dict = {
k.replace("linear.", ""): v for k, v in
torch.load(os.path.join(model_dir, f"{vector_linear_directory}/pytorch_model.bin")).items()
}
vector_linear.load_state_dict(vector_linear_dict)
vector_linear.cuda()
# Embed the queries
with torch.no_grad():
input_data = tokenizer(queries, padding="longest", truncation=True, max_length=512, return_tensors="pt")
input_data = {k: v.cuda() for k, v in input_data.items()}
attention_mask = input_data["attention_mask"]
last_hidden_state = model(**input_data)[0]
last_hidden = last_hidden_state.masked_fill(~attention_mask[..., None].bool(), 0.0)
query_vectors = last_hidden.sum(dim=1) / attention_mask.sum(dim=1)[..., None]
query_vectors = normalize(vector_linear(query_vectors).cpu().numpy())
# Embed the documents
with torch.no_grad():
input_data = tokenizer(docs, padding="longest", truncation=True, max_length=512, return_tensors="pt")
input_data = {k: v.cuda() for k, v in input_data.items()}
attention_mask = input_data["attention_mask"]
last_hidden_state = model(**input_data)[0]
last_hidden = last_hidden_state.masked_fill(~attention_mask[..., None].bool(), 0.0)
docs_vectors = last_hidden.sum(dim=1) / attention_mask.sum(dim=1)[..., None]
docs_vectors = normalize(vector_linear(docs_vectors).cpu().numpy())
print(query_vectors.shape, docs_vectors.shape)
# (2, 1024) (2, 1024)
similarities = query_vectors @ docs_vectors.T
print(similarities)
# [[0.8178789 0.2958377 ]
# [0.31938642 0.7853526 ]]
```
# FAQ
Q: The details of training?
A: The training method and datasets will be released in the future. (specific time unknown, may be provided in a paper)
Q: How to choose a suitable prompt for my own task?
A: In most cases, please use the s2p and s2s prompts. These two prompts account for the vast majority of the training
data.
Q: How to reproduce MTEB results?
A: Please use evaluation scripts in `Alibaba-NLP/gte-Qwen2-1.5B-instruct` or `intfloat/e5-mistral-7b-instruct`
Q: Why each dimension has a linear weight?
A: MRL has multiple training methods, we choose this method which has the best performance.
Q: What is the sequence length of models?
A: 512 is recommended, in our experiments, almost all models perform poorly on specialized long text retrieval datasets. Besides, the
model is trained on datasets of 512 length. This may be an optimization term.
If you have any questions, please start a discussion on community. | {"license": "mit", "tags": ["mteb", "sentence-transformers", "transformers", "sentence-similarity"], "model-index": [{"name": "stella_en_1.5B_v5", "results": [{"task": {"type": "Classification"}, "dataset": {"name": "MTEB AmazonCounterfactualClassification (en)", "type": "mteb/amazon_counterfactual", "config": "en", "split": "test", "revision": "e8379541af4e31359cca9fbcf4b00f2671dba205"}, "metrics": [{"type": "accuracy", "value": 92.86567164179104}, {"type": "ap", "value": 72.13503907102613}, {"type": "ap_weighted", "value": 72.13503907102613}, {"type": "f1", "value": 89.5586886376355}, {"type": "f1_weighted", "value": 93.13621183004571}, {"type": "main_score", "value": 92.86567164179104}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB AmazonPolarityClassification", "type": "mteb/amazon_polarity", "config": "default", "split": "test", "revision": "e2d317d38cd51312af73b3d32a06d1a08b442046"}, "metrics": [{"type": "accuracy", "value": 97.16485}, {"type": "ap", "value": 96.05546315415225}, {"type": "ap_weighted", "value": 96.05546315415225}, {"type": "f1", "value": 97.16351087403213}, {"type": "f1_weighted", "value": 97.16351087403213}, {"type": "main_score", "value": 97.16485}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB AmazonReviewsClassification (en)", "type": "mteb/amazon_reviews_multi", "config": "en", "split": "test", "revision": "1399c76144fd37290681b995c656ef9b2e06e26d"}, "metrics": [{"type": "accuracy", "value": 59.358}, {"type": "f1", "value": 59.0264615883114}, {"type": "f1_weighted", "value": 59.0264615883114}, {"type": "main_score", "value": 59.358}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB ArguAna", "type": "mteb/arguana", "config": "default", "split": "test", "revision": "c22ab2a51041ffd869aaddef7af8d8215647e41a"}, "metrics": [{"type": "main_score", "value": 65.269}, {"type": "map_at_1", "value": 41.607}, {"type": "map_at_10", "value": 57.104}, {"type": "map_at_100", "value": 57.621}, {"type": "map_at_1000", "value": 57.621}, {"type": "map_at_20", "value": 57.533}, {"type": "map_at_3", "value": 52.891999999999996}, {"type": "map_at_5", "value": 55.371}, {"type": "mrr_at_1", "value": 42.318634423897585}, {"type": "mrr_at_10", "value": 57.353970511865406}, {"type": "mrr_at_100", "value": 57.88398078476526}, {"type": "mrr_at_1000", "value": 57.88467807648422}, {"type": "mrr_at_20", "value": 57.796730533206166}, {"type": "mrr_at_3", "value": 53.200568990042775}, {"type": "mrr_at_5", "value": 55.6330014224753}, {"type": "nauc_map_at_1000_diff1", "value": 24.54414600428287}, {"type": "nauc_map_at_1000_max", "value": -8.389738078358459}, {"type": "nauc_map_at_1000_std", "value": -18.188787645801366}, {"type": "nauc_map_at_100_diff1", "value": 24.543138576462308}, {"type": "nauc_map_at_100_max", "value": -8.390896839752044}, {"type": "nauc_map_at_100_std", "value": -18.192549240185247}, {"type": "nauc_map_at_10_diff1", "value": 24.219607088995822}, {"type": "nauc_map_at_10_max", "value": -8.245734391254308}, {"type": "nauc_map_at_10_std", "value": -18.229706566466447}, {"type": "nauc_map_at_1_diff1", "value": 29.325201664812788}, {"type": "nauc_map_at_1_max", "value": -11.742800494823971}, {"type": "nauc_map_at_1_std", "value": -18.610215769702528}, {"type": "nauc_map_at_20_diff1", "value": 24.471097562798803}, {"type": "nauc_map_at_20_max", "value": -8.318035874000799}, {"type": "nauc_map_at_20_std", "value": -18.171541096773108}, {"type": 
"nauc_map_at_3_diff1", "value": 24.275846107642824}, {"type": "nauc_map_at_3_max", "value": -8.212242049581894}, {"type": "nauc_map_at_3_std", "value": -17.920379368937496}, {"type": "nauc_map_at_5_diff1", "value": 23.873692493209255}, {"type": "nauc_map_at_5_max", "value": -8.110347163828767}, {"type": "nauc_map_at_5_std", "value": -18.20863325596931}, {"type": "nauc_mrr_at_1000_diff1", "value": 22.656410956419975}, {"type": "nauc_mrr_at_1000_max", "value": -8.924888102233243}, {"type": "nauc_mrr_at_1000_std", "value": -18.103674384502526}, {"type": "nauc_mrr_at_100_diff1", "value": 22.655448817140968}, {"type": "nauc_mrr_at_100_max", "value": -8.926034318499038}, {"type": "nauc_mrr_at_100_std", "value": -18.10743930104164}, {"type": "nauc_mrr_at_10_diff1", "value": 22.297536272996872}, {"type": "nauc_mrr_at_10_max", "value": -8.836407556658274}, {"type": "nauc_mrr_at_10_std", "value": -18.1598393044477}, {"type": "nauc_mrr_at_1_diff1", "value": 27.419572424489708}, {"type": "nauc_mrr_at_1_max", "value": -11.42241314820691}, {"type": "nauc_mrr_at_1_std", "value": -18.54893865856313}, {"type": "nauc_mrr_at_20_diff1", "value": 22.590227214657418}, {"type": "nauc_mrr_at_20_max", "value": -8.849986456376993}, {"type": "nauc_mrr_at_20_std", "value": -18.0862391777352}, {"type": "nauc_mrr_at_3_diff1", "value": 22.415270167774988}, {"type": "nauc_mrr_at_3_max", "value": -8.692871854156435}, {"type": "nauc_mrr_at_3_std", "value": -17.6740102891955}, {"type": "nauc_mrr_at_5_diff1", "value": 21.96284578521464}, {"type": "nauc_mrr_at_5_max", "value": -8.757031535546025}, {"type": "nauc_mrr_at_5_std", "value": -18.210766964081294}, {"type": "nauc_ndcg_at_1000_diff1", "value": 23.939400161569115}, {"type": "nauc_ndcg_at_1000_max", "value": -7.866999120512983}, {"type": "nauc_ndcg_at_1000_std", "value": -17.981457019643617}, {"type": "nauc_ndcg_at_100_diff1", "value": 23.920033349619317}, {"type": "nauc_ndcg_at_100_max", "value": -7.889849409678031}, {"type": "nauc_ndcg_at_100_std", "value": -18.054931990360537}, {"type": "nauc_ndcg_at_10_diff1", "value": 22.543020461303534}, {"type": "nauc_ndcg_at_10_max", "value": -7.072111788010867}, {"type": "nauc_ndcg_at_10_std", "value": -18.26397604573537}, {"type": "nauc_ndcg_at_1_diff1", "value": 29.325201664812788}, {"type": "nauc_ndcg_at_1_max", "value": -11.742800494823971}, {"type": "nauc_ndcg_at_1_std", "value": -18.610215769702528}, {"type": "nauc_ndcg_at_20_diff1", "value": 23.551587021207972}, {"type": "nauc_ndcg_at_20_max", "value": -7.298056222649139}, {"type": "nauc_ndcg_at_20_std", "value": -18.056004880930608}, {"type": "nauc_ndcg_at_3_diff1", "value": 22.669089506345273}, {"type": "nauc_ndcg_at_3_max", "value": -7.278024373570137}, {"type": "nauc_ndcg_at_3_std", "value": -17.816657759914193}, {"type": "nauc_ndcg_at_5_diff1", "value": 21.72619728226575}, {"type": "nauc_ndcg_at_5_max", "value": -6.959741647471228}, {"type": "nauc_ndcg_at_5_std", "value": -18.35173705190235}, {"type": "nauc_precision_at_1000_diff1", "value": 5.0388241058076995}, {"type": "nauc_precision_at_1000_max", "value": 34.439879624882145}, {"type": "nauc_precision_at_1000_std", "value": 77.22610895194498}, {"type": "nauc_precision_at_100_diff1", "value": 1.340670767252794}, {"type": "nauc_precision_at_100_max", "value": 19.30870025961241}, {"type": "nauc_precision_at_100_std", "value": 35.37688289157788}, {"type": "nauc_precision_at_10_diff1", "value": 7.734227153124332}, {"type": "nauc_precision_at_10_max", "value": 4.202399088422237}, {"type": "nauc_precision_at_10_std", 
"value": -18.383890254046698}, {"type": "nauc_precision_at_1_diff1", "value": 29.325201664812788}, {"type": "nauc_precision_at_1_max", "value": -11.742800494823971}, {"type": "nauc_precision_at_1_std", "value": -18.610215769702528}, {"type": "nauc_precision_at_20_diff1", "value": 9.48070999361637}, {"type": "nauc_precision_at_20_max", "value": 19.056709637253025}, {"type": "nauc_precision_at_20_std", "value": -13.266821166159485}, {"type": "nauc_precision_at_3_diff1", "value": 17.245260303409747}, {"type": "nauc_precision_at_3_max", "value": -4.202455033452335}, {"type": "nauc_precision_at_3_std", "value": -17.514264039955332}, {"type": "nauc_precision_at_5_diff1", "value": 12.074628162049974}, {"type": "nauc_precision_at_5_max", "value": -1.9145501461107832}, {"type": "nauc_precision_at_5_std", "value": -19.162525528916344}, {"type": "nauc_recall_at_1000_diff1", "value": 5.038824105805915}, {"type": "nauc_recall_at_1000_max", "value": 34.43987962487738}, {"type": "nauc_recall_at_1000_std", "value": 77.22610895193765}, {"type": "nauc_recall_at_100_diff1", "value": 1.3406707672497025}, {"type": "nauc_recall_at_100_max", "value": 19.30870025960776}, {"type": "nauc_recall_at_100_std", "value": 35.37688289157515}, {"type": "nauc_recall_at_10_diff1", "value": 7.734227153124366}, {"type": "nauc_recall_at_10_max", "value": 4.202399088421976}, {"type": "nauc_recall_at_10_std", "value": -18.38389025404673}, {"type": "nauc_recall_at_1_diff1", "value": 29.325201664812788}, {"type": "nauc_recall_at_1_max", "value": -11.742800494823971}, {"type": "nauc_recall_at_1_std", "value": -18.610215769702528}, {"type": "nauc_recall_at_20_diff1", "value": 9.480709993616845}, {"type": "nauc_recall_at_20_max", "value": 19.05670963725301}, {"type": "nauc_recall_at_20_std", "value": -13.266821166158651}, {"type": "nauc_recall_at_3_diff1", "value": 17.24526030340978}, {"type": "nauc_recall_at_3_max", "value": -4.202455033452323}, {"type": "nauc_recall_at_3_std", "value": -17.51426403995538}, {"type": "nauc_recall_at_5_diff1", "value": 12.074628162049992}, {"type": "nauc_recall_at_5_max", "value": -1.914550146110865}, {"type": "nauc_recall_at_5_std", "value": -19.162525528916362}, {"type": "ndcg_at_1", "value": 41.607}, {"type": "ndcg_at_10", "value": 65.269}, {"type": "ndcg_at_100", "value": 67.289}, {"type": "ndcg_at_1000", "value": 67.29899999999999}, {"type": "ndcg_at_20", "value": 66.76299999999999}, {"type": "ndcg_at_3", "value": 56.604}, {"type": "ndcg_at_5", "value": 61.07900000000001}, {"type": "precision_at_1", "value": 41.607}, {"type": "precision_at_10", "value": 9.118}, {"type": "precision_at_100", "value": 0.996}, {"type": "precision_at_1000", "value": 0.1}, {"type": "precision_at_20", "value": 4.8469999999999995}, {"type": "precision_at_3", "value": 22.451}, {"type": "precision_at_5", "value": 15.647}, {"type": "recall_at_1", "value": 41.607}, {"type": "recall_at_10", "value": 91.181}, {"type": "recall_at_100", "value": 99.57300000000001}, {"type": "recall_at_1000", "value": 99.644}, {"type": "recall_at_20", "value": 96.942}, {"type": "recall_at_3", "value": 67.354}, {"type": "recall_at_5", "value": 78.236}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB ArxivClusteringP2P", "type": "mteb/arxiv-clustering-p2p", "config": "default", "split": "test", "revision": "a122ad7f3f0291bf49cc6f4d32aa80929df69d5d"}, "metrics": [{"type": "main_score", "value": 55.437138353189994}, {"type": "v_measure", "value": 55.437138353189994}, {"type": "v_measure_std", "value": 14.718556601335491}]}, {"task": 
{"type": "Clustering"}, "dataset": {"name": "MTEB ArxivClusteringS2S", "type": "mteb/arxiv-clustering-s2s", "config": "default", "split": "test", "revision": "f910caf1a6075f7329cdf8c1a6135696f37dbd53"}, "metrics": [{"type": "main_score", "value": 50.65858459544658}, {"type": "v_measure", "value": 50.65858459544658}, {"type": "v_measure_std", "value": 14.887033747525146}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB AskUbuntuDupQuestions", "type": "mteb/askubuntudupquestions-reranking", "config": "default", "split": "test", "revision": "2000358ca161889fa9c082cb41daa8dcfb161a54"}, "metrics": [{"type": "main_score", "value": 67.32597152838535}, {"type": "map", "value": 67.32597152838535}, {"type": "mrr", "value": 78.98683111286988}, {"type": "nAUC_map_diff1", "value": 16.8624639710487}, {"type": "nAUC_map_max", "value": 24.91996491142433}, {"type": "nAUC_map_std", "value": 17.91865808793225}, {"type": "nAUC_mrr_diff1", "value": 25.03766425631947}, {"type": "nAUC_mrr_max", "value": 41.64561939958336}, {"type": "nAUC_mrr_std", "value": 23.179909345891968}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB BIOSSES", "type": "mteb/biosses-sts", "config": "default", "split": "test", "revision": "d3fb88f8f02e40887cd149695127462bbcf29b4a"}, "metrics": [{"type": "cosine_pearson", "value": 85.790820496042}, {"type": "cosine_spearman", "value": 83.10731534330517}, {"type": "euclidean_pearson", "value": 84.61741304343133}, {"type": "euclidean_spearman", "value": 83.17297949010973}, {"type": "main_score", "value": 83.10731534330517}, {"type": "manhattan_pearson", "value": 85.2137696526676}, {"type": "manhattan_spearman", "value": 84.39168195786738}, {"type": "pearson", "value": 85.790820496042}, {"type": "spearman", "value": 83.10731534330517}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB Banking77Classification", "type": "mteb/banking77", "config": "default", "split": "test", "revision": "0fd18e25b25c072e09e0d92ab615fda904d66300"}, "metrics": [{"type": "accuracy", "value": 89.78896103896105}, {"type": "f1", "value": 89.76107366333488}, {"type": "f1_weighted", "value": 89.76107366333488}, {"type": "main_score", "value": 89.78896103896105}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB BiorxivClusteringP2P", "type": "mteb/biorxiv-clustering-p2p", "config": "default", "split": "test", "revision": "65b79d1d13f80053f67aca9498d9402c2d9f1f40"}, "metrics": [{"type": "main_score", "value": 50.68092296236376}, {"type": "v_measure", "value": 50.68092296236376}, {"type": "v_measure_std", "value": 0.7832640983085436}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB BiorxivClusteringS2S", "type": "mteb/biorxiv-clustering-s2s", "config": "default", "split": "test", "revision": "258694dd0231531bc1fd9de6ceb52a0853c6d908"}, "metrics": [{"type": "main_score", "value": 46.86629236732983}, {"type": "v_measure", "value": 46.86629236732983}, {"type": "v_measure_std", "value": 0.8784322236350974}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackRetrieval", "type": "mteb/cqadupstack", "config": "default", "split": "test", "revision": "4ffe81d471b1924886b33c7567bfb200e9eec5c4"}, "metrics": [{"type": "main_score", "value": 47.74883333333334}, {"type": "map_at_1", "value": 30.179249999999996}, {"type": "map_at_10", "value": 41.60824999999999}, {"type": "map_at_100", "value": 42.94008333333332}, {"type": "map_at_1000", "value": 43.04666666666667}, {"type": "map_at_20", "value": 42.36833333333334}, {"type": "map_at_3", "value": 
38.23491666666666}, {"type": "map_at_5", "value": 40.10183333333333}, {"type": "mrr_at_1", "value": 36.47676085808166}, {"type": "mrr_at_10", "value": 46.300991916437155}, {"type": "mrr_at_100", "value": 47.12155753713262}, {"type": "mrr_at_1000", "value": 47.168033610799945}, {"type": "mrr_at_20", "value": 46.80405724560391}, {"type": "mrr_at_3", "value": 43.77000352801797}, {"type": "mrr_at_5", "value": 45.22295361704542}, {"type": "nauc_map_at_1000_diff1", "value": 46.953671666941524}, {"type": "nauc_map_at_1000_max", "value": 32.260396316089675}, {"type": "nauc_map_at_1000_std", "value": 0.6657766120094878}, {"type": "nauc_map_at_100_diff1", "value": 46.94717463394555}, {"type": "nauc_map_at_100_max", "value": 32.25088350678177}, {"type": "nauc_map_at_100_std", "value": 0.6257017014549283}, {"type": "nauc_map_at_10_diff1", "value": 46.974678429336464}, {"type": "nauc_map_at_10_max", "value": 31.862230807295504}, {"type": "nauc_map_at_10_std", "value": -0.14758828549579284}, {"type": "nauc_map_at_1_diff1", "value": 52.48913346466124}, {"type": "nauc_map_at_1_max", "value": 29.874374024967725}, {"type": "nauc_map_at_1_std", "value": -2.433547569836134}, {"type": "nauc_map_at_20_diff1", "value": 46.96088684217651}, {"type": "nauc_map_at_20_max", "value": 32.08954208613205}, {"type": "nauc_map_at_20_std", "value": 0.25946321113436527}, {"type": "nauc_map_at_3_diff1", "value": 47.703230121518345}, {"type": "nauc_map_at_3_max", "value": 30.977880095983107}, {"type": "nauc_map_at_3_std", "value": -1.342777563991804}, {"type": "nauc_map_at_5_diff1", "value": 47.1615010199957}, {"type": "nauc_map_at_5_max", "value": 31.420885812683284}, {"type": "nauc_map_at_5_std", "value": -0.8789297099444306}, {"type": "nauc_mrr_at_1000_diff1", "value": 46.69178645962615}, {"type": "nauc_mrr_at_1000_max", "value": 34.392807413340655}, {"type": "nauc_mrr_at_1000_std", "value": 1.6155464863667934}, {"type": "nauc_mrr_at_100_diff1", "value": 46.67417236349189}, {"type": "nauc_mrr_at_100_max", "value": 34.384607045512624}, {"type": "nauc_mrr_at_100_std", "value": 1.6259917384109652}, {"type": "nauc_mrr_at_10_diff1", "value": 46.60497560446239}, {"type": "nauc_mrr_at_10_max", "value": 34.32918897817958}, {"type": "nauc_mrr_at_10_std", "value": 1.39387793769014}, {"type": "nauc_mrr_at_1_diff1", "value": 51.61608573254137}, {"type": "nauc_mrr_at_1_max", "value": 35.18105023234596}, {"type": "nauc_mrr_at_1_std", "value": 0.17943702145478177}, {"type": "nauc_mrr_at_20_diff1", "value": 46.635943069860254}, {"type": "nauc_mrr_at_20_max", "value": 34.37050973118794}, {"type": "nauc_mrr_at_20_std", "value": 1.5346464678860607}, {"type": "nauc_mrr_at_3_diff1", "value": 47.154389369038334}, {"type": "nauc_mrr_at_3_max", "value": 34.41036411855465}, {"type": "nauc_mrr_at_3_std", "value": 0.924551812357872}, {"type": "nauc_mrr_at_5_diff1", "value": 46.6690101691763}, {"type": "nauc_mrr_at_5_max", "value": 34.29740388138466}, {"type": "nauc_mrr_at_5_std", "value": 1.0567184149139792}, {"type": "nauc_ndcg_at_1000_diff1", "value": 45.375448289173264}, {"type": "nauc_ndcg_at_1000_max", "value": 33.47957083714482}, {"type": "nauc_ndcg_at_1000_std", "value": 3.192251100225568}, {"type": "nauc_ndcg_at_100_diff1", "value": 44.93601014699499}, {"type": "nauc_ndcg_at_100_max", "value": 33.21249888295249}, {"type": "nauc_ndcg_at_100_std", "value": 3.609842852934217}, {"type": "nauc_ndcg_at_10_diff1", "value": 44.87893284011915}, {"type": "nauc_ndcg_at_10_max", "value": 32.384885249478515}, {"type": "nauc_ndcg_at_10_std", "value": 
1.454493065035396}, {"type": "nauc_ndcg_at_1_diff1", "value": 51.61608573254137}, {"type": "nauc_ndcg_at_1_max", "value": 35.18105023234596}, {"type": "nauc_ndcg_at_1_std", "value": 0.17943702145478177}, {"type": "nauc_ndcg_at_20_diff1", "value": 44.867752179050605}, {"type": "nauc_ndcg_at_20_max", "value": 32.689535921840196}, {"type": "nauc_ndcg_at_20_std", "value": 2.337765158573901}, {"type": "nauc_ndcg_at_3_diff1", "value": 45.87485821381341}, {"type": "nauc_ndcg_at_3_max", "value": 32.33282450558947}, {"type": "nauc_ndcg_at_3_std", "value": 0.0681643829273283}, {"type": "nauc_ndcg_at_5_diff1", "value": 45.202902131892394}, {"type": "nauc_ndcg_at_5_max", "value": 32.1026971523917}, {"type": "nauc_ndcg_at_5_std", "value": 0.3565572833774486}, {"type": "nauc_precision_at_1000_diff1", "value": -8.935267931198956}, {"type": "nauc_precision_at_1000_max", "value": 6.464981960169269}, {"type": "nauc_precision_at_1000_std", "value": 10.662786182234633}, {"type": "nauc_precision_at_100_diff1", "value": -1.64091517847155}, {"type": "nauc_precision_at_100_max", "value": 15.175617871025024}, {"type": "nauc_precision_at_100_std", "value": 16.924256989248075}, {"type": "nauc_precision_at_10_diff1", "value": 15.676651966277047}, {"type": "nauc_precision_at_10_max", "value": 26.243734188847117}, {"type": "nauc_precision_at_10_std", "value": 10.601741034956333}, {"type": "nauc_precision_at_1_diff1", "value": 51.61608573254137}, {"type": "nauc_precision_at_1_max", "value": 35.18105023234596}, {"type": "nauc_precision_at_1_std", "value": 0.17943702145478177}, {"type": "nauc_precision_at_20_diff1", "value": 9.447267260198654}, {"type": "nauc_precision_at_20_max", "value": 23.024130858142723}, {"type": "nauc_precision_at_20_std", "value": 13.739145648899603}, {"type": "nauc_precision_at_3_diff1", "value": 30.11583572134629}, {"type": "nauc_precision_at_3_max", "value": 31.37321080069495}, {"type": "nauc_precision_at_3_std", "value": 4.705512374126024}, {"type": "nauc_precision_at_5_diff1", "value": 23.192015335996093}, {"type": "nauc_precision_at_5_max", "value": 29.415746835998764}, {"type": "nauc_precision_at_5_std", "value": 6.843498772798558}, {"type": "nauc_recall_at_1000_diff1", "value": 25.36573313426033}, {"type": "nauc_recall_at_1000_max", "value": 43.06672256524168}, {"type": "nauc_recall_at_1000_std", "value": 47.93664853815292}, {"type": "nauc_recall_at_100_diff1", "value": 31.222880916617406}, {"type": "nauc_recall_at_100_max", "value": 31.761159904172658}, {"type": "nauc_recall_at_100_std", "value": 23.034218976635877}, {"type": "nauc_recall_at_10_diff1", "value": 36.23439028915225}, {"type": "nauc_recall_at_10_max", "value": 28.473458977606438}, {"type": "nauc_recall_at_10_std", "value": 3.7797969934159}, {"type": "nauc_recall_at_1_diff1", "value": 52.48913346466124}, {"type": "nauc_recall_at_1_max", "value": 29.874374024967725}, {"type": "nauc_recall_at_1_std", "value": -2.433547569836134}, {"type": "nauc_recall_at_20_diff1", "value": 34.678676952584766}, {"type": "nauc_recall_at_20_max", "value": 29.04638392522168}, {"type": "nauc_recall_at_20_std", "value": 8.148894982082549}, {"type": "nauc_recall_at_3_diff1", "value": 41.31029996231311}, {"type": "nauc_recall_at_3_max", "value": 28.44199443414157}, {"type": "nauc_recall_at_3_std", "value": -0.747324057600377}, {"type": "nauc_recall_at_5_diff1", "value": 38.535873899920674}, {"type": "nauc_recall_at_5_max", "value": 27.942667805948375}, {"type": "nauc_recall_at_5_std", "value": 0.30652206930973686}, {"type": "ndcg_at_1", "value": 
36.47675}, {"type": "ndcg_at_10", "value": 47.74883333333334}, {"type": "ndcg_at_100", "value": 52.902416666666674}, {"type": "ndcg_at_1000", "value": 54.69116666666667}, {"type": "ndcg_at_20", "value": 49.89758333333333}, {"type": "ndcg_at_3", "value": 42.462250000000004}, {"type": "ndcg_at_5", "value": 44.91841666666667}, {"type": "precision_at_1", "value": 36.47675}, {"type": "precision_at_10", "value": 8.582416666666665}, {"type": "precision_at_100", "value": 1.31475}, {"type": "precision_at_1000", "value": 0.16458333333333333}, {"type": "precision_at_20", "value": 5.021833333333333}, {"type": "precision_at_3", "value": 20.004499999999997}, {"type": "precision_at_5", "value": 14.178666666666665}, {"type": "recall_at_1", "value": 30.179249999999996}, {"type": "recall_at_10", "value": 60.950166666666675}, {"type": "recall_at_100", "value": 83.19025}, {"type": "recall_at_1000", "value": 95.27774999999998}, {"type": "recall_at_20", "value": 68.80175}, {"type": "recall_at_3", "value": 46.01841666666666}, {"type": "recall_at_5", "value": 52.482416666666666}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB ClimateFEVER", "type": "mteb/climate-fever", "config": "default", "split": "test", "revision": "47f2ac6acb640fc46020b02a5b59fdda04d39380"}, "metrics": [{"type": "main_score", "value": 46.113}, {"type": "map_at_1", "value": 20.122999999999998}, {"type": "map_at_10", "value": 35.474}, {"type": "map_at_100", "value": 37.592}, {"type": "map_at_1000", "value": 37.773}, {"type": "map_at_20", "value": 36.637}, {"type": "map_at_3", "value": 29.731}, {"type": "map_at_5", "value": 32.964}, {"type": "mrr_at_1", "value": 46.71009771986971}, {"type": "mrr_at_10", "value": 58.855669303552105}, {"type": "mrr_at_100", "value": 59.389249674038425}, {"type": "mrr_at_1000", "value": 59.408448104362364}, {"type": "mrr_at_20", "value": 59.23881203149016}, {"type": "mrr_at_3", "value": 56.18892508143328}, {"type": "mrr_at_5", "value": 57.85342019543985}, {"type": "nauc_map_at_1000_diff1", "value": 27.047031037721958}, {"type": "nauc_map_at_1000_max", "value": 43.25240279148033}, {"type": "nauc_map_at_1000_std", "value": 20.795849418696037}, {"type": "nauc_map_at_100_diff1", "value": 27.044739015116452}, {"type": "nauc_map_at_100_max", "value": 43.24042159787812}, {"type": "nauc_map_at_100_std", "value": 20.799952124137683}, {"type": "nauc_map_at_10_diff1", "value": 27.372696854670338}, {"type": "nauc_map_at_10_max", "value": 43.054456574721684}, {"type": "nauc_map_at_10_std", "value": 19.537162110136645}, {"type": "nauc_map_at_1_diff1", "value": 43.65424623953092}, {"type": "nauc_map_at_1_max", "value": 45.17986509998762}, {"type": "nauc_map_at_1_std", "value": 8.497107052335414}, {"type": "nauc_map_at_20_diff1", "value": 27.224535846566074}, {"type": "nauc_map_at_20_max", "value": 43.12222854561229}, {"type": "nauc_map_at_20_std", "value": 20.29982972202669}, {"type": "nauc_map_at_3_diff1", "value": 30.87847002319001}, {"type": "nauc_map_at_3_max", "value": 42.890027891707575}, {"type": "nauc_map_at_3_std", "value": 13.857451947580929}, {"type": "nauc_map_at_5_diff1", "value": 27.966867093591542}, {"type": "nauc_map_at_5_max", "value": 42.35826637592201}, {"type": "nauc_map_at_5_std", "value": 16.993102524058624}, {"type": "nauc_mrr_at_1000_diff1", "value": 30.191544077608164}, {"type": "nauc_mrr_at_1000_max", "value": 44.959438920351644}, {"type": "nauc_mrr_at_1000_std", "value": 24.065801376465114}, {"type": "nauc_mrr_at_100_diff1", "value": 30.170368115494}, {"type": "nauc_mrr_at_100_max", 
"value": 44.955868115761156}, {"type": "nauc_mrr_at_100_std", "value": 24.093510767847707}, {"type": "nauc_mrr_at_10_diff1", "value": 30.128430637520175}, {"type": "nauc_mrr_at_10_max", "value": 44.97689261350708}, {"type": "nauc_mrr_at_10_std", "value": 24.037049561818897}, {"type": "nauc_mrr_at_1_diff1", "value": 35.323351939108214}, {"type": "nauc_mrr_at_1_max", "value": 43.85026244855636}, {"type": "nauc_mrr_at_1_std", "value": 17.040662141218974}, {"type": "nauc_mrr_at_20_diff1", "value": 30.192006556160443}, {"type": "nauc_mrr_at_20_max", "value": 45.02814530774032}, {"type": "nauc_mrr_at_20_std", "value": 24.20885865448696}, {"type": "nauc_mrr_at_3_diff1", "value": 29.88250163424518}, {"type": "nauc_mrr_at_3_max", "value": 44.25768944883186}, {"type": "nauc_mrr_at_3_std", "value": 22.804183393364198}, {"type": "nauc_mrr_at_5_diff1", "value": 30.269824490420767}, {"type": "nauc_mrr_at_5_max", "value": 44.97443265796657}, {"type": "nauc_mrr_at_5_std", "value": 23.894159916141177}, {"type": "nauc_ndcg_at_1000_diff1", "value": 24.533764005407356}, {"type": "nauc_ndcg_at_1000_max", "value": 44.50902713386608}, {"type": "nauc_ndcg_at_1000_std", "value": 27.589506980238404}, {"type": "nauc_ndcg_at_100_diff1", "value": 24.209785073940353}, {"type": "nauc_ndcg_at_100_max", "value": 44.18257063893669}, {"type": "nauc_ndcg_at_100_std", "value": 27.963150866401943}, {"type": "nauc_ndcg_at_10_diff1", "value": 25.168069201989486}, {"type": "nauc_ndcg_at_10_max", "value": 43.84940910683214}, {"type": "nauc_ndcg_at_10_std", "value": 24.810707270956435}, {"type": "nauc_ndcg_at_1_diff1", "value": 35.323351939108214}, {"type": "nauc_ndcg_at_1_max", "value": 43.85026244855636}, {"type": "nauc_ndcg_at_1_std", "value": 17.040662141218974}, {"type": "nauc_ndcg_at_20_diff1", "value": 24.829924800466834}, {"type": "nauc_ndcg_at_20_max", "value": 43.738574327059716}, {"type": "nauc_ndcg_at_20_std", "value": 26.252370278684072}, {"type": "nauc_ndcg_at_3_diff1", "value": 27.321943393906274}, {"type": "nauc_ndcg_at_3_max", "value": 42.16584786993447}, {"type": "nauc_ndcg_at_3_std", "value": 18.24775079455969}, {"type": "nauc_ndcg_at_5_diff1", "value": 26.043785418347998}, {"type": "nauc_ndcg_at_5_max", "value": 42.874593895388344}, {"type": "nauc_ndcg_at_5_std", "value": 21.294004555506117}, {"type": "nauc_precision_at_1000_diff1", "value": -22.073027615308582}, {"type": "nauc_precision_at_1000_max", "value": -6.549723766317357}, {"type": "nauc_precision_at_1000_std", "value": 18.301749191241306}, {"type": "nauc_precision_at_100_diff1", "value": -15.654286887593619}, {"type": "nauc_precision_at_100_max", "value": 6.401516251421999}, {"type": "nauc_precision_at_100_std", "value": 29.170680324929805}, {"type": "nauc_precision_at_10_diff1", "value": -4.362381972892247}, {"type": "nauc_precision_at_10_max", "value": 22.10943515872447}, {"type": "nauc_precision_at_10_std", "value": 31.869699459530022}, {"type": "nauc_precision_at_1_diff1", "value": 35.323351939108214}, {"type": "nauc_precision_at_1_max", "value": 43.85026244855636}, {"type": "nauc_precision_at_1_std", "value": 17.040662141218974}, {"type": "nauc_precision_at_20_diff1", "value": -7.50749661117875}, {"type": "nauc_precision_at_20_max", "value": 16.80584016023257}, {"type": "nauc_precision_at_20_std", "value": 31.976755897112437}, {"type": "nauc_precision_at_3_diff1", "value": 7.402667538773083}, {"type": "nauc_precision_at_3_max", "value": 31.2088401330676}, {"type": "nauc_precision_at_3_std", "value": 24.287905698405662}, {"type": 
"nauc_precision_at_5_diff1", "value": 0.7479172565343901}, {"type": "nauc_precision_at_5_max", "value": 26.28427734237825}, {"type": "nauc_precision_at_5_std", "value": 28.246947120310317}, {"type": "nauc_recall_at_1000_diff1", "value": 2.4778431086370496}, {"type": "nauc_recall_at_1000_max", "value": 40.2231995797509}, {"type": "nauc_recall_at_1000_std", "value": 52.62124052183862}, {"type": "nauc_recall_at_100_diff1", "value": 8.960962419741463}, {"type": "nauc_recall_at_100_max", "value": 35.81132850291491}, {"type": "nauc_recall_at_100_std", "value": 40.020903251786166}, {"type": "nauc_recall_at_10_diff1", "value": 15.603400751376636}, {"type": "nauc_recall_at_10_max", "value": 37.570127529136485}, {"type": "nauc_recall_at_10_std", "value": 28.07128410238545}, {"type": "nauc_recall_at_1_diff1", "value": 43.65424623953092}, {"type": "nauc_recall_at_1_max", "value": 45.17986509998762}, {"type": "nauc_recall_at_1_std", "value": 8.497107052335414}, {"type": "nauc_recall_at_20_diff1", "value": 13.844820282832346}, {"type": "nauc_recall_at_20_max", "value": 36.0106148516309}, {"type": "nauc_recall_at_20_std", "value": 31.453103910565254}, {"type": "nauc_recall_at_3_diff1", "value": 24.359328154117748}, {"type": "nauc_recall_at_3_max", "value": 39.93774251377568}, {"type": "nauc_recall_at_3_std", "value": 16.214921517509648}, {"type": "nauc_recall_at_5_diff1", "value": 18.75788451360292}, {"type": "nauc_recall_at_5_max", "value": 38.177646107055516}, {"type": "nauc_recall_at_5_std", "value": 22.17196825834675}, {"type": "ndcg_at_1", "value": 46.71}, {"type": "ndcg_at_10", "value": 46.113}, {"type": "ndcg_at_100", "value": 53.035}, {"type": "ndcg_at_1000", "value": 55.724}, {"type": "ndcg_at_20", "value": 48.929}, {"type": "ndcg_at_3", "value": 39.501999999999995}, {"type": "ndcg_at_5", "value": 41.792}, {"type": "precision_at_1", "value": 46.71}, {"type": "precision_at_10", "value": 14.274000000000001}, {"type": "precision_at_100", "value": 2.1870000000000003}, {"type": "precision_at_1000", "value": 0.269}, {"type": "precision_at_20", "value": 8.375}, {"type": "precision_at_3", "value": 29.881}, {"type": "precision_at_5", "value": 22.697}, {"type": "recall_at_1", "value": 20.122999999999998}, {"type": "recall_at_10", "value": 52.22}, {"type": "recall_at_100", "value": 75.388}, {"type": "recall_at_1000", "value": 89.938}, {"type": "recall_at_20", "value": 60.077000000000005}, {"type": "recall_at_3", "value": 35.150999999999996}, {"type": "recall_at_5", "value": 42.748000000000005}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB DBPedia", "type": "mteb/dbpedia", "config": "default", "split": "test", "revision": "c0f706b76e590d620bd6618b3ca8efdd34e2d659"}, "metrics": [{"type": "main_score", "value": 52.276999999999994}, {"type": "map_at_1", "value": 9.949}, {"type": "map_at_10", "value": 24.891}, {"type": "map_at_100", "value": 37.111}, {"type": "map_at_1000", "value": 39.266}, {"type": "map_at_20", "value": 29.685}, {"type": "map_at_3", "value": 16.586000000000002}, {"type": "map_at_5", "value": 19.982}, {"type": "mrr_at_1", "value": 76.25}, {"type": "mrr_at_10", "value": 82.4518849206349}, {"type": "mrr_at_100", "value": 82.70302194564499}, {"type": "mrr_at_1000", "value": 82.70909729942254}, {"type": "mrr_at_20", "value": 82.60492765962964}, {"type": "mrr_at_3", "value": 81.33333333333331}, {"type": "mrr_at_5", "value": 82.14583333333331}, {"type": "nauc_map_at_1000_diff1", "value": 21.427201262456556}, {"type": "nauc_map_at_1000_max", "value": 35.357361590816076}, {"type": 
"nauc_map_at_1000_std", "value": 24.785419223353717}, {"type": "nauc_map_at_100_diff1", "value": 22.82358692021537}, {"type": "nauc_map_at_100_max", "value": 35.07399692072945}, {"type": "nauc_map_at_100_std", "value": 22.679878828987025}, {"type": "nauc_map_at_10_diff1", "value": 26.491769223479643}, {"type": "nauc_map_at_10_max", "value": 20.78079385443902}, {"type": "nauc_map_at_10_std", "value": -4.910406292079661}, {"type": "nauc_map_at_1_diff1", "value": 35.20851030208876}, {"type": "nauc_map_at_1_max", "value": 5.783003346365858}, {"type": "nauc_map_at_1_std", "value": -21.11679133835354}, {"type": "nauc_map_at_20_diff1", "value": 24.80097499300491}, {"type": "nauc_map_at_20_max", "value": 26.807021360774975}, {"type": "nauc_map_at_20_std", "value": 4.793103995429955}, {"type": "nauc_map_at_3_diff1", "value": 29.238193458890173}, {"type": "nauc_map_at_3_max", "value": 10.300839972189456}, {"type": "nauc_map_at_3_std", "value": -17.889666731981592}, {"type": "nauc_map_at_5_diff1", "value": 28.773624870573926}, {"type": "nauc_map_at_5_max", "value": 14.951435645422887}, {"type": "nauc_map_at_5_std", "value": -13.319697827173565}, {"type": "nauc_mrr_at_1000_diff1", "value": 55.232544856708785}, {"type": "nauc_mrr_at_1000_max", "value": 64.73225637682637}, {"type": "nauc_mrr_at_1000_std", "value": 37.57480399594188}, {"type": "nauc_mrr_at_100_diff1", "value": 55.219251601773735}, {"type": "nauc_mrr_at_100_max", "value": 64.73305063663611}, {"type": "nauc_mrr_at_100_std", "value": 37.56458562909293}, {"type": "nauc_mrr_at_10_diff1", "value": 55.123463838253464}, {"type": "nauc_mrr_at_10_max", "value": 64.91914041040233}, {"type": "nauc_mrr_at_10_std", "value": 37.76482503851598}, {"type": "nauc_mrr_at_1_diff1", "value": 56.45461238513347}, {"type": "nauc_mrr_at_1_max", "value": 63.11782510293676}, {"type": "nauc_mrr_at_1_std", "value": 33.592561284868985}, {"type": "nauc_mrr_at_20_diff1", "value": 55.15401961460458}, {"type": "nauc_mrr_at_20_max", "value": 64.77145835613156}, {"type": "nauc_mrr_at_20_std", "value": 37.471561418305804}, {"type": "nauc_mrr_at_3_diff1", "value": 54.64387438697658}, {"type": "nauc_mrr_at_3_max", "value": 64.27618995019164}, {"type": "nauc_mrr_at_3_std", "value": 39.391637295269014}, {"type": "nauc_mrr_at_5_diff1", "value": 55.08702591239485}, {"type": "nauc_mrr_at_5_max", "value": 64.6071475650635}, {"type": "nauc_mrr_at_5_std", "value": 37.97185134269896}, {"type": "nauc_ndcg_at_1000_diff1", "value": 31.696698876400387}, {"type": "nauc_ndcg_at_1000_max", "value": 52.12183760001191}, {"type": "nauc_ndcg_at_1000_std", "value": 40.197596211778716}, {"type": "nauc_ndcg_at_100_diff1", "value": 33.253120193433666}, {"type": "nauc_ndcg_at_100_max", "value": 49.47167758554746}, {"type": "nauc_ndcg_at_100_std", "value": 32.643833139756204}, {"type": "nauc_ndcg_at_10_diff1", "value": 27.065541392580013}, {"type": "nauc_ndcg_at_10_max", "value": 45.83504281289289}, {"type": "nauc_ndcg_at_10_std", "value": 27.11739500732328}, {"type": "nauc_ndcg_at_1_diff1", "value": 49.42808250022517}, {"type": "nauc_ndcg_at_1_max", "value": 53.502615048520354}, {"type": "nauc_ndcg_at_1_std", "value": 27.17555908836708}, {"type": "nauc_ndcg_at_20_diff1", "value": 29.374791382330308}, {"type": "nauc_ndcg_at_20_max", "value": 43.91246842479055}, {"type": "nauc_ndcg_at_20_std", "value": 23.419410620550316}, {"type": "nauc_ndcg_at_3_diff1", "value": 26.71550354496204}, {"type": "nauc_ndcg_at_3_max", "value": 43.9641457892003}, {"type": "nauc_ndcg_at_3_std", "value": 27.320024167947686}, 
{"type": "nauc_ndcg_at_5_diff1", "value": 27.020654974589487}, {"type": "nauc_ndcg_at_5_max", "value": 46.130417266030584}, {"type": "nauc_ndcg_at_5_std", "value": 28.392009019010068}, {"type": "nauc_precision_at_1000_diff1", "value": -21.47455482181002}, {"type": "nauc_precision_at_1000_max", "value": -9.721907229236024}, {"type": "nauc_precision_at_1000_std", "value": -1.061132062651487}, {"type": "nauc_precision_at_100_diff1", "value": -12.35759246101943}, {"type": "nauc_precision_at_100_max", "value": 15.509512444892168}, {"type": "nauc_precision_at_100_std", "value": 36.21183578592014}, {"type": "nauc_precision_at_10_diff1", "value": -6.136998947343125}, {"type": "nauc_precision_at_10_max", "value": 32.30037906748288}, {"type": "nauc_precision_at_10_std", "value": 41.4500302476981}, {"type": "nauc_precision_at_1_diff1", "value": 56.45461238513347}, {"type": "nauc_precision_at_1_max", "value": 63.11782510293676}, {"type": "nauc_precision_at_1_std", "value": 33.592561284868985}, {"type": "nauc_precision_at_20_diff1", "value": -7.335890123683174}, {"type": "nauc_precision_at_20_max", "value": 28.31417075291312}, {"type": "nauc_precision_at_20_std", "value": 41.405935715061815}, {"type": "nauc_precision_at_3_diff1", "value": 7.117255890225942}, {"type": "nauc_precision_at_3_max", "value": 39.19894132683829}, {"type": "nauc_precision_at_3_std", "value": 38.48255841994843}, {"type": "nauc_precision_at_5_diff1", "value": 1.861523090114206}, {"type": "nauc_precision_at_5_max", "value": 38.11649223007208}, {"type": "nauc_precision_at_5_std", "value": 40.52993530374645}, {"type": "nauc_recall_at_1000_diff1", "value": 26.497648584314636}, {"type": "nauc_recall_at_1000_max", "value": 44.48069746734414}, {"type": "nauc_recall_at_1000_std", "value": 53.16438130228715}, {"type": "nauc_recall_at_100_diff1", "value": 26.353456899511446}, {"type": "nauc_recall_at_100_max", "value": 37.57379787884197}, {"type": "nauc_recall_at_100_std", "value": 29.197468295989548}, {"type": "nauc_recall_at_10_diff1", "value": 22.80445738351114}, {"type": "nauc_recall_at_10_max", "value": 15.895630778449046}, {"type": "nauc_recall_at_10_std", "value": -8.746224797644501}, {"type": "nauc_recall_at_1_diff1", "value": 35.20851030208876}, {"type": "nauc_recall_at_1_max", "value": 5.783003346365858}, {"type": "nauc_recall_at_1_std", "value": -21.11679133835354}, {"type": "nauc_recall_at_20_diff1", "value": 22.34028867678706}, {"type": "nauc_recall_at_20_max", "value": 21.42373427646772}, {"type": "nauc_recall_at_20_std", "value": 0.4533036151015875}, {"type": "nauc_recall_at_3_diff1", "value": 24.96853445599229}, {"type": "nauc_recall_at_3_max", "value": 6.245185375804208}, {"type": "nauc_recall_at_3_std", "value": -20.200240127099622}, {"type": "nauc_recall_at_5_diff1", "value": 24.749259476710623}, {"type": "nauc_recall_at_5_max", "value": 11.024592845995942}, {"type": "nauc_recall_at_5_std", "value": -16.15683085641543}, {"type": "ndcg_at_1", "value": 64.125}, {"type": "ndcg_at_10", "value": 52.276999999999994}, {"type": "ndcg_at_100", "value": 57.440000000000005}, {"type": "ndcg_at_1000", "value": 64.082}, {"type": "ndcg_at_20", "value": 51.383}, {"type": "ndcg_at_3", "value": 55.769000000000005}, {"type": "ndcg_at_5", "value": 53.978}, {"type": "precision_at_1", "value": 76.25}, {"type": "precision_at_10", "value": 43.05}, {"type": "precision_at_100", "value": 14.09}, {"type": "precision_at_1000", "value": 2.662}, {"type": "precision_at_20", "value": 33.112}, {"type": "precision_at_3", "value": 59.833000000000006}, 
{"type": "precision_at_5", "value": 53.05}, {"type": "recall_at_1", "value": 9.949}, {"type": "recall_at_10", "value": 30.424}, {"type": "recall_at_100", "value": 64.062}, {"type": "recall_at_1000", "value": 85.916}, {"type": "recall_at_20", "value": 39.895}, {"type": "recall_at_3", "value": 17.876}, {"type": "recall_at_5", "value": 22.536}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB EmotionClassification", "type": "mteb/emotion", "config": "default", "split": "test", "revision": "4f58c6b202a23cf9a4da393831edf4f9183cad37"}, "metrics": [{"type": "accuracy", "value": 84.29499999999999}, {"type": "f1", "value": 79.76188258172078}, {"type": "f1_weighted", "value": 84.96026012933847}, {"type": "main_score", "value": 84.29499999999999}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB FEVER", "type": "mteb/fever", "config": "default", "split": "test", "revision": "bea83ef9e8fb933d90a2f1d5515737465d613e12"}, "metrics": [{"type": "main_score", "value": 94.83200000000001}, {"type": "map_at_1", "value": 87.339}, {"type": "map_at_10", "value": 92.92099999999999}, {"type": "map_at_100", "value": 93.108}, {"type": "map_at_1000", "value": 93.116}, {"type": "map_at_20", "value": 93.041}, {"type": "map_at_3", "value": 92.219}, {"type": "map_at_5", "value": 92.664}, {"type": "mrr_at_1", "value": 93.99939993999399}, {"type": "mrr_at_10", "value": 96.55188137861403}, {"type": "mrr_at_100", "value": 96.5652366009286}, {"type": "mrr_at_1000", "value": 96.5652625550811}, {"type": "mrr_at_20", "value": 96.5601781754844}, {"type": "mrr_at_3", "value": 96.45714571457142}, {"type": "mrr_at_5", "value": 96.544904490449}, {"type": "nauc_map_at_1000_diff1", "value": 51.81676454961933}, {"type": "nauc_map_at_1000_max", "value": 24.904822914926118}, {"type": "nauc_map_at_1000_std", "value": -3.8110347821630404}, {"type": "nauc_map_at_100_diff1", "value": 51.77514975011158}, {"type": "nauc_map_at_100_max", "value": 24.912497341800094}, {"type": "nauc_map_at_100_std", "value": -3.76229517662447}, {"type": "nauc_map_at_10_diff1", "value": 51.29608296382479}, {"type": "nauc_map_at_10_max", "value": 24.78704970246707}, {"type": "nauc_map_at_10_std", "value": -3.723130815783328}, {"type": "nauc_map_at_1_diff1", "value": 59.90813138005125}, {"type": "nauc_map_at_1_max", "value": 24.58479295693794}, {"type": "nauc_map_at_1_std", "value": -8.056152492777027}, {"type": "nauc_map_at_20_diff1", "value": 51.428639331678326}, {"type": "nauc_map_at_20_max", "value": 24.849214517705086}, {"type": "nauc_map_at_20_std", "value": -3.685550123874596}, {"type": "nauc_map_at_3_diff1", "value": 50.94399923719279}, {"type": "nauc_map_at_3_max", "value": 24.359700180006207}, {"type": "nauc_map_at_3_std", "value": -5.407767408816422}, {"type": "nauc_map_at_5_diff1", "value": 50.767302682959546}, {"type": "nauc_map_at_5_max", "value": 24.491113461892215}, {"type": "nauc_map_at_5_std", "value": -4.058336127339082}, {"type": "nauc_mrr_at_1000_diff1", "value": 79.86042313551833}, {"type": "nauc_mrr_at_1000_max", "value": 23.20960445633933}, {"type": "nauc_mrr_at_1000_std", "value": -23.54334295120471}, {"type": "nauc_mrr_at_100_diff1", "value": 79.85991247027636}, {"type": "nauc_mrr_at_100_max", "value": 23.210085926780106}, {"type": "nauc_mrr_at_100_std", "value": -23.542508200789197}, {"type": "nauc_mrr_at_10_diff1", "value": 79.71095155563415}, {"type": "nauc_mrr_at_10_max", "value": 23.24128650883908}, {"type": "nauc_mrr_at_10_std", "value": -23.408502781834102}, {"type": "nauc_mrr_at_1_diff1", "value": 
82.6349900233902}, {"type": "nauc_mrr_at_1_max", "value": 21.994548214014227}, {"type": "nauc_mrr_at_1_std", "value": -22.549769792179262}, {"type": "nauc_mrr_at_20_diff1", "value": 79.76465012873038}, {"type": "nauc_mrr_at_20_max", "value": 23.17575026523213}, {"type": "nauc_mrr_at_20_std", "value": -23.492660166315048}, {"type": "nauc_mrr_at_3_diff1", "value": 79.91074933379953}, {"type": "nauc_mrr_at_3_max", "value": 24.14246499097892}, {"type": "nauc_mrr_at_3_std", "value": -25.22601708389664}, {"type": "nauc_mrr_at_5_diff1", "value": 79.62092651565847}, {"type": "nauc_mrr_at_5_max", "value": 23.315937737034425}, {"type": "nauc_mrr_at_5_std", "value": -23.317659360058403}, {"type": "nauc_ndcg_at_1000_diff1", "value": 54.404537986779225}, {"type": "nauc_ndcg_at_1000_max", "value": 25.38408304128995}, {"type": "nauc_ndcg_at_1000_std", "value": -4.916709117696968}, {"type": "nauc_ndcg_at_100_diff1", "value": 53.2448598868241}, {"type": "nauc_ndcg_at_100_max", "value": 25.75325255295546}, {"type": "nauc_ndcg_at_100_std", "value": -3.680507005630751}, {"type": "nauc_ndcg_at_10_diff1", "value": 50.81057355170232}, {"type": "nauc_ndcg_at_10_max", "value": 25.006448273343807}, {"type": "nauc_ndcg_at_10_std", "value": -2.8979899112515577}, {"type": "nauc_ndcg_at_1_diff1", "value": 82.6349900233902}, {"type": "nauc_ndcg_at_1_max", "value": 21.994548214014227}, {"type": "nauc_ndcg_at_1_std", "value": -22.549769792179262}, {"type": "nauc_ndcg_at_20_diff1", "value": 51.205023097166304}, {"type": "nauc_ndcg_at_20_max", "value": 25.22133626556826}, {"type": "nauc_ndcg_at_20_std", "value": -2.9506328244150155}, {"type": "nauc_ndcg_at_3_diff1", "value": 51.79780256736321}, {"type": "nauc_ndcg_at_3_max", "value": 24.81137324438439}, {"type": "nauc_ndcg_at_3_std", "value": -6.881223858227807}, {"type": "nauc_ndcg_at_5_diff1", "value": 50.290038260564565}, {"type": "nauc_ndcg_at_5_max", "value": 24.57250792165796}, {"type": "nauc_ndcg_at_5_std", "value": -3.5124628344654596}, {"type": "nauc_precision_at_1000_diff1", "value": -20.215211396894333}, {"type": "nauc_precision_at_1000_max", "value": -14.165452298769171}, {"type": "nauc_precision_at_1000_std", "value": -2.0952871214470816}, {"type": "nauc_precision_at_100_diff1", "value": -22.340257474494607}, {"type": "nauc_precision_at_100_max", "value": -12.697885641360282}, {"type": "nauc_precision_at_100_std", "value": 1.0688624940286244}, {"type": "nauc_precision_at_10_diff1", "value": -24.78271817420798}, {"type": "nauc_precision_at_10_max", "value": -12.625257500222656}, {"type": "nauc_precision_at_10_std", "value": 3.223250450607087}, {"type": "nauc_precision_at_1_diff1", "value": 82.6349900233902}, {"type": "nauc_precision_at_1_max", "value": 21.994548214014227}, {"type": "nauc_precision_at_1_std", "value": -22.549769792179262}, {"type": "nauc_precision_at_20_diff1", "value": -24.375756227194177}, {"type": "nauc_precision_at_20_max", "value": -12.341015011563536}, {"type": "nauc_precision_at_20_std", "value": 2.7475274619387955}, {"type": "nauc_precision_at_3_diff1", "value": -24.8251306777365}, {"type": "nauc_precision_at_3_max", "value": -13.109579709589042}, {"type": "nauc_precision_at_3_std", "value": -1.2233442335420748}, {"type": "nauc_precision_at_5_diff1", "value": -26.955418583344894}, {"type": "nauc_precision_at_5_max", "value": -13.598630838071015}, {"type": "nauc_precision_at_5_std", "value": 2.545780631940738}, {"type": "nauc_recall_at_1000_diff1", "value": 0.2542680835344437}, {"type": "nauc_recall_at_1000_max", "value": 
49.38194243035277}, {"type": "nauc_recall_at_1000_std", "value": 57.021502715846026}, {"type": "nauc_recall_at_100_diff1", "value": 5.062154815367015}, {"type": "nauc_recall_at_100_max", "value": 45.41178380188437}, {"type": "nauc_recall_at_100_std", "value": 50.78382225901813}, {"type": "nauc_recall_at_10_diff1", "value": 20.429153629007818}, {"type": "nauc_recall_at_10_max", "value": 27.516855026155508}, {"type": "nauc_recall_at_10_std", "value": 21.367491371755467}, {"type": "nauc_recall_at_1_diff1", "value": 59.90813138005125}, {"type": "nauc_recall_at_1_max", "value": 24.58479295693794}, {"type": "nauc_recall_at_1_std", "value": -8.056152492777027}, {"type": "nauc_recall_at_20_diff1", "value": 13.072430858896942}, {"type": "nauc_recall_at_20_max", "value": 29.5522659183247}, {"type": "nauc_recall_at_20_std", "value": 28.70569974090291}, {"type": "nauc_recall_at_3_diff1", "value": 30.419084482663617}, {"type": "nauc_recall_at_3_max", "value": 25.627389580252835}, {"type": "nauc_recall_at_3_std", "value": 2.5557690877637054}, {"type": "nauc_recall_at_5_diff1", "value": 22.92561435069869}, {"type": "nauc_recall_at_5_max", "value": 25.545265063475455}, {"type": "nauc_recall_at_5_std", "value": 14.736172663072786}, {"type": "ndcg_at_1", "value": 93.999}, {"type": "ndcg_at_10", "value": 94.83200000000001}, {"type": "ndcg_at_100", "value": 95.363}, {"type": "ndcg_at_1000", "value": 95.478}, {"type": "ndcg_at_20", "value": 95.077}, {"type": "ndcg_at_3", "value": 94.143}, {"type": "ndcg_at_5", "value": 94.525}, {"type": "precision_at_1", "value": 93.999}, {"type": "precision_at_10", "value": 11.029}, {"type": "precision_at_100", "value": 1.1560000000000001}, {"type": "precision_at_1000", "value": 0.11800000000000001}, {"type": "precision_at_20", "value": 5.62}, {"type": "precision_at_3", "value": 35.219}, {"type": "precision_at_5", "value": 21.584}, {"type": "recall_at_1", "value": 87.339}, {"type": "recall_at_10", "value": 97.026}, {"type": "recall_at_100", "value": 98.936}, {"type": "recall_at_1000", "value": 99.599}, {"type": "recall_at_20", "value": 97.744}, {"type": "recall_at_3", "value": 95.069}, {"type": "recall_at_5", "value": 96.177}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB FiQA2018", "type": "mteb/fiqa", "config": "default", "split": "test", "revision": "27a168819829fe9bcd655c2df245fb19452e8e06"}, "metrics": [{"type": "main_score", "value": 60.480000000000004}, {"type": "map_at_1", "value": 31.529}, {"type": "map_at_10", "value": 52.081}, {"type": "map_at_100", "value": 54.342}, {"type": "map_at_1000", "value": 54.449000000000005}, {"type": "map_at_20", "value": 53.479}, {"type": "map_at_3", "value": 45.471000000000004}, {"type": "map_at_5", "value": 49.164}, {"type": "mrr_at_1", "value": 60.03086419753087}, {"type": "mrr_at_10", "value": 67.73754409171075}, {"type": "mrr_at_100", "value": 68.332432152368}, {"type": "mrr_at_1000", "value": 68.34150941774908}, {"type": "mrr_at_20", "value": 68.14780993838725}, {"type": "mrr_at_3", "value": 65.6378600823045}, {"type": "mrr_at_5", "value": 66.88014403292176}, {"type": "nauc_map_at_1000_diff1", "value": 45.36598134579052}, {"type": "nauc_map_at_1000_max", "value": 31.891451119906943}, {"type": "nauc_map_at_1000_std", "value": -15.41454384137943}, {"type": "nauc_map_at_100_diff1", "value": 45.31268291874018}, {"type": "nauc_map_at_100_max", "value": 31.811055683002092}, {"type": "nauc_map_at_100_std", "value": -15.348503855591417}, {"type": "nauc_map_at_10_diff1", "value": 45.22606983565892}, {"type": 
"nauc_map_at_10_max", "value": 30.46108534749699}, {"type": "nauc_map_at_10_std", "value": -16.618086029682555}, {"type": "nauc_map_at_1_diff1", "value": 49.94952823753276}, {"type": "nauc_map_at_1_max", "value": 13.770377574254548}, {"type": "nauc_map_at_1_std", "value": -14.946357968858653}, {"type": "nauc_map_at_20_diff1", "value": 45.29274207897926}, {"type": "nauc_map_at_20_max", "value": 31.27332015148257}, {"type": "nauc_map_at_20_std", "value": -15.782946115613129}, {"type": "nauc_map_at_3_diff1", "value": 47.94248233566038}, {"type": "nauc_map_at_3_max", "value": 24.022838776825456}, {"type": "nauc_map_at_3_std", "value": -17.103518542262208}, {"type": "nauc_map_at_5_diff1", "value": 45.85345590031722}, {"type": "nauc_map_at_5_max", "value": 27.78341379004547}, {"type": "nauc_map_at_5_std", "value": -17.490850791756326}, {"type": "nauc_mrr_at_1000_diff1", "value": 58.225141047822824}, {"type": "nauc_mrr_at_1000_max", "value": 43.39606904140525}, {"type": "nauc_mrr_at_1000_std", "value": -14.64093518199122}, {"type": "nauc_mrr_at_100_diff1", "value": 58.22137274179545}, {"type": "nauc_mrr_at_100_max", "value": 43.39567568136935}, {"type": "nauc_mrr_at_100_std", "value": -14.62512313985582}, {"type": "nauc_mrr_at_10_diff1", "value": 58.03217329957151}, {"type": "nauc_mrr_at_10_max", "value": 43.633561683075186}, {"type": "nauc_mrr_at_10_std", "value": -14.563703576023808}, {"type": "nauc_mrr_at_1_diff1", "value": 61.48979902647692}, {"type": "nauc_mrr_at_1_max", "value": 43.1938079066948}, {"type": "nauc_mrr_at_1_std", "value": -15.808138277440465}, {"type": "nauc_mrr_at_20_diff1", "value": 58.13185370150794}, {"type": "nauc_mrr_at_20_max", "value": 43.35607721183147}, {"type": "nauc_mrr_at_20_std", "value": -14.635812702971263}, {"type": "nauc_mrr_at_3_diff1", "value": 58.698963168321264}, {"type": "nauc_mrr_at_3_max", "value": 43.633129249785405}, {"type": "nauc_mrr_at_3_std", "value": -15.733246346983854}, {"type": "nauc_mrr_at_5_diff1", "value": 57.94156745229547}, {"type": "nauc_mrr_at_5_max", "value": 43.14152462640525}, {"type": "nauc_mrr_at_5_std", "value": -15.318685307750895}, {"type": "nauc_ndcg_at_1000_diff1", "value": 47.871896043731496}, {"type": "nauc_ndcg_at_1000_max", "value": 37.159845167533426}, {"type": "nauc_ndcg_at_1000_std", "value": -13.067288160833485}, {"type": "nauc_ndcg_at_100_diff1", "value": 47.046171407204426}, {"type": "nauc_ndcg_at_100_max", "value": 36.422514360855835}, {"type": "nauc_ndcg_at_100_std", "value": -11.636859259571441}, {"type": "nauc_ndcg_at_10_diff1", "value": 46.232628149078096}, {"type": "nauc_ndcg_at_10_max", "value": 34.82402625088358}, {"type": "nauc_ndcg_at_10_std", "value": -14.768545542980114}, {"type": "nauc_ndcg_at_1_diff1", "value": 61.48979902647692}, {"type": "nauc_ndcg_at_1_max", "value": 43.1938079066948}, {"type": "nauc_ndcg_at_1_std", "value": -15.808138277440465}, {"type": "nauc_ndcg_at_20_diff1", "value": 46.51116172390955}, {"type": "nauc_ndcg_at_20_max", "value": 35.36362650568298}, {"type": "nauc_ndcg_at_20_std", "value": -12.849406209182826}, {"type": "nauc_ndcg_at_3_diff1", "value": 47.39832263785871}, {"type": "nauc_ndcg_at_3_max", "value": 35.67466264628456}, {"type": "nauc_ndcg_at_3_std", "value": -17.257717349296943}, {"type": "nauc_ndcg_at_5_diff1", "value": 45.91049493804232}, {"type": "nauc_ndcg_at_5_max", "value": 33.8405091138445}, {"type": "nauc_ndcg_at_5_std", "value": -17.477069902735895}, {"type": "nauc_precision_at_1000_diff1", "value": -12.037873000917767}, {"type": "nauc_precision_at_1000_max", 
"value": 26.043220150002295}, {"type": "nauc_precision_at_1000_std", "value": 6.84910668321572}, {"type": "nauc_precision_at_100_diff1", "value": -9.383403459051864}, {"type": "nauc_precision_at_100_max", "value": 29.68713170610003}, {"type": "nauc_precision_at_100_std", "value": 10.079531587056152}, {"type": "nauc_precision_at_10_diff1", "value": 3.3433323353925135}, {"type": "nauc_precision_at_10_max", "value": 38.31790111725993}, {"type": "nauc_precision_at_10_std", "value": 0.7888123304710856}, {"type": "nauc_precision_at_1_diff1", "value": 61.48979902647692}, {"type": "nauc_precision_at_1_max", "value": 43.1938079066948}, {"type": "nauc_precision_at_1_std", "value": -15.808138277440465}, {"type": "nauc_precision_at_20_diff1", "value": -2.083500986294448}, {"type": "nauc_precision_at_20_max", "value": 35.77143835726343}, {"type": "nauc_precision_at_20_std", "value": 5.318547021874003}, {"type": "nauc_precision_at_3_diff1", "value": 23.335617788912586}, {"type": "nauc_precision_at_3_max", "value": 39.81973275320871}, {"type": "nauc_precision_at_3_std", "value": -8.442769390555561}, {"type": "nauc_precision_at_5_diff1", "value": 11.521087842589482}, {"type": "nauc_precision_at_5_max", "value": 39.527792539828255}, {"type": "nauc_precision_at_5_std", "value": -5.412729503701626}, {"type": "nauc_recall_at_1000_diff1", "value": 10.6830893047453}, {"type": "nauc_recall_at_1000_max", "value": 8.834504311238423}, {"type": "nauc_recall_at_1000_std", "value": 24.670754304859692}, {"type": "nauc_recall_at_100_diff1", "value": 20.646020385527358}, {"type": "nauc_recall_at_100_max", "value": 20.121595011523294}, {"type": "nauc_recall_at_100_std", "value": 19.42307459311791}, {"type": "nauc_recall_at_10_diff1", "value": 33.01029313733417}, {"type": "nauc_recall_at_10_max", "value": 27.948634980368702}, {"type": "nauc_recall_at_10_std", "value": -10.239767371462975}, {"type": "nauc_recall_at_1_diff1", "value": 49.94952823753276}, {"type": "nauc_recall_at_1_max", "value": 13.770377574254548}, {"type": "nauc_recall_at_1_std", "value": -14.946357968858653}, {"type": "nauc_recall_at_20_diff1", "value": 30.040111045267963}, {"type": "nauc_recall_at_20_max", "value": 25.984919302418184}, {"type": "nauc_recall_at_20_std", "value": -1.4998001817460804}, {"type": "nauc_recall_at_3_diff1", "value": 42.24410559113653}, {"type": "nauc_recall_at_3_max", "value": 20.269503583626914}, {"type": "nauc_recall_at_3_std", "value": -17.09578532600584}, {"type": "nauc_recall_at_5_diff1", "value": 36.124149735848945}, {"type": "nauc_recall_at_5_max", "value": 22.708022306002622}, {"type": "nauc_recall_at_5_std", "value": -16.966976847236193}, {"type": "ndcg_at_1", "value": 60.031}, {"type": "ndcg_at_10", "value": 60.480000000000004}, {"type": "ndcg_at_100", "value": 66.94099999999999}, {"type": "ndcg_at_1000", "value": 68.303}, {"type": "ndcg_at_20", "value": 63.536}, {"type": "ndcg_at_3", "value": 55.903999999999996}, {"type": "ndcg_at_5", "value": 57.387}, {"type": "precision_at_1", "value": 60.031}, {"type": "precision_at_10", "value": 16.682}, {"type": "precision_at_100", "value": 2.336}, {"type": "precision_at_1000", "value": 0.259}, {"type": "precision_at_20", "value": 9.66}, {"type": "precision_at_3", "value": 37.191}, {"type": "precision_at_5", "value": 27.253}, {"type": "recall_at_1", "value": 31.529}, {"type": "recall_at_10", "value": 68.035}, {"type": "recall_at_100", "value": 90.925}, {"type": "recall_at_1000", "value": 98.688}, {"type": "recall_at_20", "value": 77.453}, {"type": "recall_at_3", "value": 
50.221000000000004}, {"type": "recall_at_5", "value": 58.209999999999994}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB HotpotQA", "type": "mteb/hotpotqa", "config": "default", "split": "test", "revision": "ab518f4d6fcca38d87c25209f94beba119d02014"}, "metrics": [{"type": "main_score", "value": 76.67399999999999}, {"type": "map_at_1", "value": 43.822}, {"type": "map_at_10", "value": 68.82000000000001}, {"type": "map_at_100", "value": 69.659}, {"type": "map_at_1000", "value": 69.714}, {"type": "map_at_20", "value": 69.305}, {"type": "map_at_3", "value": 65.517}, {"type": "map_at_5", "value": 67.633}, {"type": "mrr_at_1", "value": 87.643484132343}, {"type": "mrr_at_10", "value": 91.28134679485098}, {"type": "mrr_at_100", "value": 91.37985230614755}, {"type": "mrr_at_1000", "value": 91.38202467630681}, {"type": "mrr_at_20", "value": 91.34718855278429}, {"type": "mrr_at_3", "value": 90.75849651136599}, {"type": "mrr_at_5", "value": 91.10961062345235}, {"type": "nauc_map_at_1000_diff1", "value": 3.7670405082837477}, {"type": "nauc_map_at_1000_max", "value": 14.410594409695182}, {"type": "nauc_map_at_1000_std", "value": 7.94738583292685}, {"type": "nauc_map_at_100_diff1", "value": 3.738796209193936}, {"type": "nauc_map_at_100_max", "value": 14.408029101534694}, {"type": "nauc_map_at_100_std", "value": 7.979641077687816}, {"type": "nauc_map_at_10_diff1", "value": 3.334917978089454}, {"type": "nauc_map_at_10_max", "value": 13.975255289147748}, {"type": "nauc_map_at_10_std", "value": 7.491959628012161}, {"type": "nauc_map_at_1_diff1", "value": 75.35066482050009}, {"type": "nauc_map_at_1_max", "value": 53.573503488571475}, {"type": "nauc_map_at_1_std", "value": -6.542030594426993}, {"type": "nauc_map_at_20_diff1", "value": 3.5197129341582083}, {"type": "nauc_map_at_20_max", "value": 14.159880698006816}, {"type": "nauc_map_at_20_std", "value": 7.856574384998483}, {"type": "nauc_map_at_3_diff1", "value": 3.0992333232864064}, {"type": "nauc_map_at_3_max", "value": 12.513959281222112}, {"type": "nauc_map_at_3_std", "value": 4.352912866014865}, {"type": "nauc_map_at_5_diff1", "value": 3.0351688998572537}, {"type": "nauc_map_at_5_max", "value": 13.21599457624529}, {"type": "nauc_map_at_5_std", "value": 6.246882983214777}, {"type": "nauc_mrr_at_1000_diff1", "value": 75.23953736361132}, {"type": "nauc_mrr_at_1000_max", "value": 56.64260717262164}, {"type": "nauc_mrr_at_1000_std", "value": -4.865932053762276}, {"type": "nauc_mrr_at_100_diff1", "value": 75.24091372816497}, {"type": "nauc_mrr_at_100_max", "value": 56.64831104504846}, {"type": "nauc_mrr_at_100_std", "value": -4.850966297943324}, {"type": "nauc_mrr_at_10_diff1", "value": 75.26540178053416}, {"type": "nauc_mrr_at_10_max", "value": 56.828755673428965}, {"type": "nauc_mrr_at_10_std", "value": -4.8401126970944635}, {"type": "nauc_mrr_at_1_diff1", "value": 75.35066482050009}, {"type": "nauc_mrr_at_1_max", "value": 53.573503488571475}, {"type": "nauc_mrr_at_1_std", "value": -6.542030594426993}, {"type": "nauc_mrr_at_20_diff1", "value": 75.24453050729845}, {"type": "nauc_mrr_at_20_max", "value": 56.69220588401435}, {"type": "nauc_mrr_at_20_std", "value": -4.843700730832108}, {"type": "nauc_mrr_at_3_diff1", "value": 74.98411648336175}, {"type": "nauc_mrr_at_3_max", "value": 56.766537573537114}, {"type": "nauc_mrr_at_3_std", "value": -4.909712671649337}, {"type": "nauc_mrr_at_5_diff1", "value": 75.20599020991028}, {"type": "nauc_mrr_at_5_max", "value": 56.64236207782237}, {"type": "nauc_mrr_at_5_std", "value": -5.208907367513977}, {"type": 
"nauc_ndcg_at_1000_diff1", "value": 11.48307079099774}, {"type": "nauc_ndcg_at_1000_max", "value": 20.893326881675176}, {"type": "nauc_ndcg_at_1000_std", "value": 10.43489838692119}, {"type": "nauc_ndcg_at_100_diff1", "value": 10.395588735754927}, {"type": "nauc_ndcg_at_100_max", "value": 20.529573302516912}, {"type": "nauc_ndcg_at_100_std", "value": 11.252973083654268}, {"type": "nauc_ndcg_at_10_diff1", "value": 8.596739352741972}, {"type": "nauc_ndcg_at_10_max", "value": 18.475863682540673}, {"type": "nauc_ndcg_at_10_std", "value": 9.175831033463352}, {"type": "nauc_ndcg_at_1_diff1", "value": 75.35066482050009}, {"type": "nauc_ndcg_at_1_max", "value": 53.573503488571475}, {"type": "nauc_ndcg_at_1_std", "value": -6.542030594426993}, {"type": "nauc_ndcg_at_20_diff1", "value": 8.998033972471749}, {"type": "nauc_ndcg_at_20_max", "value": 18.892085875404522}, {"type": "nauc_ndcg_at_20_std", "value": 10.3241608901084}, {"type": "nauc_ndcg_at_3_diff1", "value": 8.796384949533579}, {"type": "nauc_ndcg_at_3_max", "value": 16.515261419885274}, {"type": "nauc_ndcg_at_3_std", "value": 4.081902976576701}, {"type": "nauc_ndcg_at_5_diff1", "value": 8.277259464605025}, {"type": "nauc_ndcg_at_5_max", "value": 17.163053202909527}, {"type": "nauc_ndcg_at_5_std", "value": 6.652669449704474}, {"type": "nauc_precision_at_1000_diff1", "value": -3.490556596304827}, {"type": "nauc_precision_at_1000_max", "value": 31.0473259001597}, {"type": "nauc_precision_at_1000_std", "value": 52.36921397692622}, {"type": "nauc_precision_at_100_diff1", "value": -6.420747959222489}, {"type": "nauc_precision_at_100_max", "value": 20.555887056005936}, {"type": "nauc_precision_at_100_std", "value": 36.119132870798495}, {"type": "nauc_precision_at_10_diff1", "value": -6.461726057290426}, {"type": "nauc_precision_at_10_max", "value": 12.161081825341915}, {"type": "nauc_precision_at_10_std", "value": 17.961318451839993}, {"type": "nauc_precision_at_1_diff1", "value": 75.35066482050009}, {"type": "nauc_precision_at_1_max", "value": 53.573503488571475}, {"type": "nauc_precision_at_1_std", "value": -6.542030594426993}, {"type": "nauc_precision_at_20_diff1", "value": -7.361461296416161}, {"type": "nauc_precision_at_20_max", "value": 12.663621261696733}, {"type": "nauc_precision_at_20_std", "value": 23.312476851670286}, {"type": "nauc_precision_at_3_diff1", "value": -3.299056912774522}, {"type": "nauc_precision_at_3_max", "value": 9.85602375812038}, {"type": "nauc_precision_at_3_std", "value": 6.4962782003155475}, {"type": "nauc_precision_at_5_diff1", "value": -5.3155827772027795}, {"type": "nauc_precision_at_5_max", "value": 10.32907751171833}, {"type": "nauc_precision_at_5_std", "value": 11.384098087196932}, {"type": "nauc_recall_at_1000_diff1", "value": -3.4905565963043332}, {"type": "nauc_recall_at_1000_max", "value": 31.04732590016041}, {"type": "nauc_recall_at_1000_std", "value": 52.36921397692641}, {"type": "nauc_recall_at_100_diff1", "value": -6.420747959222586}, {"type": "nauc_recall_at_100_max", "value": 20.55588705600596}, {"type": "nauc_recall_at_100_std", "value": 36.11913287079825}, {"type": "nauc_recall_at_10_diff1", "value": -6.461726057290347}, {"type": "nauc_recall_at_10_max", "value": 12.161081825342022}, {"type": "nauc_recall_at_10_std", "value": 17.96131845184002}, {"type": "nauc_recall_at_1_diff1", "value": 75.35066482050009}, {"type": "nauc_recall_at_1_max", "value": 53.573503488571475}, {"type": "nauc_recall_at_1_std", "value": -6.542030594426993}, {"type": "nauc_recall_at_20_diff1", "value": -7.361461296416054}, 
{"type": "nauc_recall_at_20_max", "value": 12.66362126169679}, {"type": "nauc_recall_at_20_std", "value": 23.312476851670382}, {"type": "nauc_recall_at_3_diff1", "value": -3.2990569127745886}, {"type": "nauc_recall_at_3_max", "value": 9.856023758120296}, {"type": "nauc_recall_at_3_std", "value": 6.496278200315444}, {"type": "nauc_recall_at_5_diff1", "value": -5.315582777202729}, {"type": "nauc_recall_at_5_max", "value": 10.329077511718229}, {"type": "nauc_recall_at_5_std", "value": 11.384098087196932}, {"type": "ndcg_at_1", "value": 87.643}, {"type": "ndcg_at_10", "value": 76.67399999999999}, {"type": "ndcg_at_100", "value": 79.462}, {"type": "ndcg_at_1000", "value": 80.43599999999999}, {"type": "ndcg_at_20", "value": 77.83}, {"type": "ndcg_at_3", "value": 72.256}, {"type": "ndcg_at_5", "value": 74.789}, {"type": "precision_at_1", "value": 87.643}, {"type": "precision_at_10", "value": 15.726999999999999}, {"type": "precision_at_100", "value": 1.791}, {"type": "precision_at_1000", "value": 0.192}, {"type": "precision_at_20", "value": 8.236}, {"type": "precision_at_3", "value": 45.919}, {"type": "precision_at_5", "value": 29.558}, {"type": "recall_at_1", "value": 43.822}, {"type": "recall_at_10", "value": 78.636}, {"type": "recall_at_100", "value": 89.527}, {"type": "recall_at_1000", "value": 95.868}, {"type": "recall_at_20", "value": 82.363}, {"type": "recall_at_3", "value": 68.879}, {"type": "recall_at_5", "value": 73.896}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB ImdbClassification", "type": "mteb/imdb", "config": "default", "split": "test", "revision": "3d86128a09e091d6018b6d26cad27f2739fc2db7"}, "metrics": [{"type": "accuracy", "value": 96.6608}, {"type": "ap", "value": 95.14657820401189}, {"type": "ap_weighted", "value": 95.14657820401189}, {"type": "f1", "value": 96.66029695623422}, {"type": "f1_weighted", "value": 96.66029695623423}, {"type": "main_score", "value": 96.6608}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB MSMARCO", "type": "mteb/msmarco", "config": "default", "split": "dev", "revision": "c5a29a104738b98a9e76336939199e264163d4a0"}, "metrics": [{"type": "main_score", "value": 45.217}, {"type": "map_at_1", "value": 24.728}, {"type": "map_at_10", "value": 37.933}, {"type": "map_at_100", "value": 39.074999999999996}, {"type": "map_at_1000", "value": 39.115}, {"type": "map_at_20", "value": 38.663}, {"type": "map_at_3", "value": 33.904}, {"type": "map_at_5", "value": 36.217}, {"type": "mrr_at_1", "value": 25.44412607449857}, {"type": "mrr_at_10", "value": 38.52640196479737}, {"type": "mrr_at_100", "value": 39.60462889736067}, {"type": "mrr_at_1000", "value": 39.638904296248526}, {"type": "mrr_at_20", "value": 39.2234365827559}, {"type": "mrr_at_3", "value": 34.59646609360076}, {"type": "mrr_at_5", "value": 36.8801337153773}, {"type": "nauc_map_at_1000_diff1", "value": 37.645652178132174}, {"type": "nauc_map_at_1000_max", "value": 9.953357023361367}, {"type": "nauc_map_at_1000_std", "value": -20.800238036721503}, {"type": "nauc_map_at_100_diff1", "value": 37.643073495974555}, {"type": "nauc_map_at_100_max", "value": 9.95921239641703}, {"type": "nauc_map_at_100_std", "value": -20.76517765535793}, {"type": "nauc_map_at_10_diff1", "value": 37.44380763335014}, {"type": "nauc_map_at_10_max", "value": 9.917273043055342}, {"type": "nauc_map_at_10_std", "value": -21.467951225710898}, {"type": "nauc_map_at_1_diff1", "value": 41.02118887981969}, {"type": "nauc_map_at_1_max", "value": 8.301113449711778}, {"type": "nauc_map_at_1_std", "value": 
-19.436814224415027}, {"type": "nauc_map_at_20_diff1", "value": 37.58156586490493}, {"type": "nauc_map_at_20_max", "value": 9.972927967610659}, {"type": "nauc_map_at_20_std", "value": -20.951374218839387}, {"type": "nauc_map_at_3_diff1", "value": 37.67246795684178}, {"type": "nauc_map_at_3_max", "value": 9.307031378909478}, {"type": "nauc_map_at_3_std", "value": -21.77026217965021}, {"type": "nauc_map_at_5_diff1", "value": 37.39086482095963}, {"type": "nauc_map_at_5_max", "value": 9.732739107368566}, {"type": "nauc_map_at_5_std", "value": -21.8424296893692}, {"type": "nauc_mrr_at_1000_diff1", "value": 37.36666719603192}, {"type": "nauc_mrr_at_1000_max", "value": 9.79040465289953}, {"type": "nauc_mrr_at_1000_std", "value": -20.590147245965568}, {"type": "nauc_mrr_at_100_diff1", "value": 37.36560296629318}, {"type": "nauc_mrr_at_100_max", "value": 9.798113710672162}, {"type": "nauc_mrr_at_100_std", "value": -20.556791838504292}, {"type": "nauc_mrr_at_10_diff1", "value": 37.19257605840734}, {"type": "nauc_mrr_at_10_max", "value": 9.749429811638063}, {"type": "nauc_mrr_at_10_std", "value": -21.206407664327276}, {"type": "nauc_mrr_at_1_diff1", "value": 40.98478651095172}, {"type": "nauc_mrr_at_1_max", "value": 8.173841799119707}, {"type": "nauc_mrr_at_1_std", "value": -19.530027987868017}, {"type": "nauc_mrr_at_20_diff1", "value": 37.29973172861245}, {"type": "nauc_mrr_at_20_max", "value": 9.815127660001345}, {"type": "nauc_mrr_at_20_std", "value": -20.700860112175928}, {"type": "nauc_mrr_at_3_diff1", "value": 37.282848009425734}, {"type": "nauc_mrr_at_3_max", "value": 9.172741713108193}, {"type": "nauc_mrr_at_3_std", "value": -21.563630513502996}, {"type": "nauc_mrr_at_5_diff1", "value": 37.08609827303586}, {"type": "nauc_mrr_at_5_max", "value": 9.604643424273284}, {"type": "nauc_mrr_at_5_std", "value": -21.580110806494094}, {"type": "nauc_ndcg_at_1000_diff1", "value": 37.086587020218545}, {"type": "nauc_ndcg_at_1000_max", "value": 10.696860688467472}, {"type": "nauc_ndcg_at_1000_std", "value": -19.50989939916873}, {"type": "nauc_ndcg_at_100_diff1", "value": 37.03794531268128}, {"type": "nauc_ndcg_at_100_max", "value": 10.940820719182339}, {"type": "nauc_ndcg_at_100_std", "value": -18.28651832370893}, {"type": "nauc_ndcg_at_10_diff1", "value": 36.21062857920633}, {"type": "nauc_ndcg_at_10_max", "value": 10.845172882571733}, {"type": "nauc_ndcg_at_10_std", "value": -21.454301679510106}, {"type": "nauc_ndcg_at_1_diff1", "value": 40.98478651095172}, {"type": "nauc_ndcg_at_1_max", "value": 8.173841799119707}, {"type": "nauc_ndcg_at_1_std", "value": -19.530027987868017}, {"type": "nauc_ndcg_at_20_diff1", "value": 36.583262733100526}, {"type": "nauc_ndcg_at_20_max", "value": 11.10492720898974}, {"type": "nauc_ndcg_at_20_std", "value": -19.41753284137609}, {"type": "nauc_ndcg_at_3_diff1", "value": 36.57271365035382}, {"type": "nauc_ndcg_at_3_max", "value": 9.56073433062999}, {"type": "nauc_ndcg_at_3_std", "value": -22.324263670932915}, {"type": "nauc_ndcg_at_5_diff1", "value": 36.09419372820154}, {"type": "nauc_ndcg_at_5_max", "value": 10.357384992631271}, {"type": "nauc_ndcg_at_5_std", "value": -22.389578276324894}, {"type": "nauc_precision_at_1000_diff1", "value": -2.7435338714030597}, {"type": "nauc_precision_at_1000_max", "value": 4.302274933383809}, {"type": "nauc_precision_at_1000_std", "value": 8.456846348638948}, {"type": "nauc_precision_at_100_diff1", "value": 15.149466332615983}, {"type": "nauc_precision_at_100_max", "value": 12.501013731673163}, {"type": "nauc_precision_at_100_std", 
"value": 15.909667509021785}, {"type": "nauc_precision_at_10_diff1", "value": 28.699788688314214}, {"type": "nauc_precision_at_10_max", "value": 13.024586051842347}, {"type": "nauc_precision_at_10_std", "value": -19.197658937078703}, {"type": "nauc_precision_at_1_diff1", "value": 40.98478651095172}, {"type": "nauc_precision_at_1_max", "value": 8.173841799119707}, {"type": "nauc_precision_at_1_std", "value": -19.530027987868017}, {"type": "nauc_precision_at_20_diff1", "value": 26.519292942353395}, {"type": "nauc_precision_at_20_max", "value": 14.389979272056438}, {"type": "nauc_precision_at_20_std", "value": -7.030956994938155}, {"type": "nauc_precision_at_3_diff1", "value": 32.87913492278213}, {"type": "nauc_precision_at_3_max", "value": 9.673660161387776}, {"type": "nauc_precision_at_3_std", "value": -23.905612656592172}, {"type": "nauc_precision_at_5_diff1", "value": 30.903850113238597}, {"type": "nauc_precision_at_5_max", "value": 11.482375434154898}, {"type": "nauc_precision_at_5_std", "value": -23.828657095254247}, {"type": "nauc_recall_at_1000_diff1", "value": 35.80765639589219}, {"type": "nauc_recall_at_1000_max", "value": 50.94532805969448}, {"type": "nauc_recall_at_1000_std", "value": 66.79910877083275}, {"type": "nauc_recall_at_100_diff1", "value": 34.96182828311028}, {"type": "nauc_recall_at_100_max", "value": 21.729699631790556}, {"type": "nauc_recall_at_100_std", "value": 23.509439011686474}, {"type": "nauc_recall_at_10_diff1", "value": 31.88371369567137}, {"type": "nauc_recall_at_10_max", "value": 14.425389702697073}, {"type": "nauc_recall_at_10_std", "value": -20.95578001880924}, {"type": "nauc_recall_at_1_diff1", "value": 41.02118887981969}, {"type": "nauc_recall_at_1_max", "value": 8.301113449711778}, {"type": "nauc_recall_at_1_std", "value": -19.436814224415027}, {"type": "nauc_recall_at_20_diff1", "value": 32.42718780622455}, {"type": "nauc_recall_at_20_max", "value": 16.90686126329399}, {"type": "nauc_recall_at_20_std", "value": -9.38158227016737}, {"type": "nauc_recall_at_3_diff1", "value": 33.68966646043966}, {"type": "nauc_recall_at_3_max", "value": 10.336277419708532}, {"type": "nauc_recall_at_3_std", "value": -23.80165869168538}, {"type": "nauc_recall_at_5_diff1", "value": 32.26258807452426}, {"type": "nauc_recall_at_5_max", "value": 12.303713005399935}, {"type": "nauc_recall_at_5_std", "value": -23.87721891164968}, {"type": "ndcg_at_1", "value": 25.444}, {"type": "ndcg_at_10", "value": 45.217}, {"type": "ndcg_at_100", "value": 50.575}, {"type": "ndcg_at_1000", "value": 51.519999999999996}, {"type": "ndcg_at_20", "value": 47.786}, {"type": "ndcg_at_3", "value": 37.067}, {"type": "ndcg_at_5", "value": 41.184}, {"type": "precision_at_1", "value": 25.444}, {"type": "precision_at_10", "value": 7.07}, {"type": "precision_at_100", "value": 0.9730000000000001}, {"type": "precision_at_1000", "value": 0.106}, {"type": "precision_at_20", "value": 4.072}, {"type": "precision_at_3", "value": 15.754999999999999}, {"type": "precision_at_5", "value": 11.544}, {"type": "recall_at_1", "value": 24.728}, {"type": "recall_at_10", "value": 67.607}, {"type": "recall_at_100", "value": 92.094}, {"type": "recall_at_1000", "value": 99.165}, {"type": "recall_at_20", "value": 77.529}, {"type": "recall_at_3", "value": 45.535}, {"type": "recall_at_5", "value": 55.394}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MTOPDomainClassification (en)", "type": "mteb/mtop_domain", "config": "en", "split": "test", "revision": "d80d48c1eb48d3562165c59d59d0034df9fff0bf"}, "metrics": 
[{"type": "accuracy", "value": 99.01276789785682}, {"type": "f1", "value": 98.9288649250924}, {"type": "f1_weighted", "value": 99.01406884928141}, {"type": "main_score", "value": 99.01276789785682}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MTOPIntentClassification (en)", "type": "mteb/mtop_intent", "config": "en", "split": "test", "revision": "ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba"}, "metrics": [{"type": "accuracy", "value": 92.78385772913816}, {"type": "f1", "value": 79.78115704297824}, {"type": "f1_weighted", "value": 93.90424147486428}, {"type": "main_score", "value": 92.78385772913816}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MassiveIntentClassification (en)", "type": "mteb/amazon_massive_intent", "config": "en", "split": "test", "revision": "4672e20407010da34463acc759c162ca9734bca6"}, "metrics": [{"type": "accuracy", "value": 85.83053127101546}, {"type": "f1", "value": 82.72036139888232}, {"type": "f1_weighted", "value": 85.81759723866098}, {"type": "main_score", "value": 85.83053127101546}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MassiveScenarioClassification (en)", "type": "mteb/amazon_massive_scenario", "config": "en", "split": "test", "revision": "fad2c6e8459f9e1c45d9315f4953d921437d70f8"}, "metrics": [{"type": "accuracy", "value": 90.19838601210489}, {"type": "f1", "value": 89.55260197964978}, {"type": "f1_weighted", "value": 90.11422965504119}, {"type": "main_score", "value": 90.19838601210489}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB MedrxivClusteringP2P", "type": "mteb/medrxiv-clustering-p2p", "config": "default", "split": "test", "revision": "e7a26af6f3ae46b30dde8737f02c07b1505bcc73"}, "metrics": [{"type": "main_score", "value": 46.866746897607094}, {"type": "v_measure", "value": 46.866746897607094}, {"type": "v_measure_std", "value": 1.0966477896919726}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB MedrxivClusteringS2S", "type": "mteb/medrxiv-clustering-s2s", "config": "default", "split": "test", "revision": "35191c8c0dca72d8ff3efcd72aa802307d469663"}, "metrics": [{"type": "main_score", "value": 44.6538827415503}, {"type": "v_measure", "value": 44.6538827415503}, {"type": "v_measure_std", "value": 1.1649569936599116}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB MindSmallReranking", "type": "mteb/mind_small", "config": "default", "split": "test", "revision": "59042f120c80e8afa9cdbb224f67076cec0fc9a7"}, "metrics": [{"type": "main_score", "value": 33.05449204940555}, {"type": "map", "value": 33.05449204940555}, {"type": "mrr", "value": 34.32562058439585}, {"type": "nAUC_map_diff1", "value": 11.465656013162807}, {"type": "nAUC_map_max", "value": -20.400088169502308}, {"type": "nAUC_map_std", "value": -2.638964886362445}, {"type": "nAUC_mrr_diff1", "value": 10.644290702481207}, {"type": "nAUC_mrr_max", "value": -15.304687384645769}, {"type": "nAUC_mrr_std", "value": -0.519919931348978}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB NFCorpus", "type": "mteb/nfcorpus", "config": "default", "split": "test", "revision": "ec0fa4fe99da2ff19ca1214b7966684033a58814"}, "metrics": [{"type": "main_score", "value": 41.998000000000005}, {"type": "map_at_1", "value": 6.907000000000001}, {"type": "map_at_10", "value": 16.397000000000002}, {"type": "map_at_100", "value": 21.69}, {"type": "map_at_1000", "value": 23.652}, {"type": "map_at_20", "value": 18.629}, {"type": "map_at_3", "value": 11.969000000000001}, {"type": "map_at_5", "value": 13.894}, 
{"type": "mrr_at_1", "value": 53.25077399380805}, {"type": "mrr_at_10", "value": 61.8561108653988}, {"type": "mrr_at_100", "value": 62.42447851935404}, {"type": "mrr_at_1000", "value": 62.459626424428095}, {"type": "mrr_at_20", "value": 62.287236389990696}, {"type": "mrr_at_3", "value": 60.42311661506711}, {"type": "mrr_at_5", "value": 61.36738906088753}, {"type": "nauc_map_at_1000_diff1", "value": 17.159461939643844}, {"type": "nauc_map_at_1000_max", "value": 32.42764938789903}, {"type": "nauc_map_at_1000_std", "value": 11.039427848422093}, {"type": "nauc_map_at_100_diff1", "value": 19.089532984187503}, {"type": "nauc_map_at_100_max", "value": 31.96721085058713}, {"type": "nauc_map_at_100_std", "value": 6.947468655726444}, {"type": "nauc_map_at_10_diff1", "value": 25.77255342629802}, {"type": "nauc_map_at_10_max", "value": 26.163590320961543}, {"type": "nauc_map_at_10_std", "value": -5.2588093720998375}, {"type": "nauc_map_at_1_diff1", "value": 46.31602607957798}, {"type": "nauc_map_at_1_max", "value": 11.807757660801942}, {"type": "nauc_map_at_1_std", "value": -13.984889089354317}, {"type": "nauc_map_at_20_diff1", "value": 22.308161130465365}, {"type": "nauc_map_at_20_max", "value": 29.070587307827722}, {"type": "nauc_map_at_20_std", "value": -1.0103056620851558}, {"type": "nauc_map_at_3_diff1", "value": 33.580827849617506}, {"type": "nauc_map_at_3_max", "value": 17.661630885799042}, {"type": "nauc_map_at_3_std", "value": -11.463282544041888}, {"type": "nauc_map_at_5_diff1", "value": 30.32603342696912}, {"type": "nauc_map_at_5_max", "value": 20.938905485667245}, {"type": "nauc_map_at_5_std", "value": -10.537086968155755}, {"type": "nauc_mrr_at_1000_diff1", "value": 24.45065397805829}, {"type": "nauc_mrr_at_1000_max", "value": 48.17519860927417}, {"type": "nauc_mrr_at_1000_std", "value": 30.350767549118903}, {"type": "nauc_mrr_at_100_diff1", "value": 24.444061606534486}, {"type": "nauc_mrr_at_100_max", "value": 48.1922894212229}, {"type": "nauc_mrr_at_100_std", "value": 30.379257816584094}, {"type": "nauc_mrr_at_10_diff1", "value": 24.25598717198779}, {"type": "nauc_mrr_at_10_max", "value": 48.10437607774264}, {"type": "nauc_mrr_at_10_std", "value": 30.090202482685996}, {"type": "nauc_mrr_at_1_diff1", "value": 26.907595285201264}, {"type": "nauc_mrr_at_1_max", "value": 44.006974050369955}, {"type": "nauc_mrr_at_1_std", "value": 26.921001962861062}, {"type": "nauc_mrr_at_20_diff1", "value": 24.462771570553738}, {"type": "nauc_mrr_at_20_max", "value": 48.264688196799746}, {"type": "nauc_mrr_at_20_std", "value": 30.498095141265914}, {"type": "nauc_mrr_at_3_diff1", "value": 24.76829388237229}, {"type": "nauc_mrr_at_3_max", "value": 48.213758704739924}, {"type": "nauc_mrr_at_3_std", "value": 30.1502853918892}, {"type": "nauc_mrr_at_5_diff1", "value": 24.476494932330247}, {"type": "nauc_mrr_at_5_max", "value": 47.977250552198804}, {"type": "nauc_mrr_at_5_std", "value": 29.65248143104835}, {"type": "nauc_ndcg_at_1000_diff1", "value": 13.055818920426246}, {"type": "nauc_ndcg_at_1000_max", "value": 46.00986444256306}, {"type": "nauc_ndcg_at_1000_std", "value": 29.622662054922085}, {"type": "nauc_ndcg_at_100_diff1", "value": 12.260551238228816}, {"type": "nauc_ndcg_at_100_max", "value": 39.89783048267698}, {"type": "nauc_ndcg_at_100_std", "value": 23.806961617956613}, {"type": "nauc_ndcg_at_10_diff1", "value": 11.002915931619567}, {"type": "nauc_ndcg_at_10_max", "value": 39.79323759244374}, {"type": "nauc_ndcg_at_10_std", "value": 23.053072152911046}, {"type": "nauc_ndcg_at_1_diff1", "value": 
27.560910719974434}, {"type": "nauc_ndcg_at_1_max", "value": 41.21084046258119}, {"type": "nauc_ndcg_at_1_std", "value": 26.112891742912893}, {"type": "nauc_ndcg_at_20_diff1", "value": 10.085854089024496}, {"type": "nauc_ndcg_at_20_max", "value": 37.88629173784684}, {"type": "nauc_ndcg_at_20_std", "value": 23.17664322248358}, {"type": "nauc_ndcg_at_3_diff1", "value": 16.58969583405987}, {"type": "nauc_ndcg_at_3_max", "value": 41.282222954101435}, {"type": "nauc_ndcg_at_3_std", "value": 21.080670648392747}, {"type": "nauc_ndcg_at_5_diff1", "value": 13.893127947909885}, {"type": "nauc_ndcg_at_5_max", "value": 40.21188015992804}, {"type": "nauc_ndcg_at_5_std", "value": 21.417443978842652}, {"type": "nauc_precision_at_1000_diff1", "value": -17.227504530334564}, {"type": "nauc_precision_at_1000_max", "value": 3.798554468439066}, {"type": "nauc_precision_at_1000_std", "value": 35.73617809452683}, {"type": "nauc_precision_at_100_diff1", "value": -17.63388230218776}, {"type": "nauc_precision_at_100_max", "value": 15.079399882407094}, {"type": "nauc_precision_at_100_std", "value": 41.83698491321226}, {"type": "nauc_precision_at_10_diff1", "value": -11.850925959645156}, {"type": "nauc_precision_at_10_max", "value": 35.93283968364352}, {"type": "nauc_precision_at_10_std", "value": 34.391271855921296}, {"type": "nauc_precision_at_1_diff1", "value": 27.730860778824823}, {"type": "nauc_precision_at_1_max", "value": 43.97462471516834}, {"type": "nauc_precision_at_1_std", "value": 27.491068270978896}, {"type": "nauc_precision_at_20_diff1", "value": -14.281328840943347}, {"type": "nauc_precision_at_20_max", "value": 29.469099781759006}, {"type": "nauc_precision_at_20_std", "value": 38.54703022340941}, {"type": "nauc_precision_at_3_diff1", "value": 3.486986910413196}, {"type": "nauc_precision_at_3_max", "value": 41.21107780473768}, {"type": "nauc_precision_at_3_std", "value": 24.057479124531216}, {"type": "nauc_precision_at_5_diff1", "value": -3.0623787872866233}, {"type": "nauc_precision_at_5_max", "value": 37.49266386466702}, {"type": "nauc_precision_at_5_std", "value": 26.894454268004935}, {"type": "nauc_recall_at_1000_diff1", "value": -2.446891864334283}, {"type": "nauc_recall_at_1000_max", "value": 23.867293584643377}, {"type": "nauc_recall_at_1000_std", "value": 16.34707128224595}, {"type": "nauc_recall_at_100_diff1", "value": 4.891133690841179}, {"type": "nauc_recall_at_100_max", "value": 24.56727964996522}, {"type": "nauc_recall_at_100_std", "value": 9.847212953200797}, {"type": "nauc_recall_at_10_diff1", "value": 19.211912363585288}, {"type": "nauc_recall_at_10_max", "value": 24.825344777920737}, {"type": "nauc_recall_at_10_std", "value": -5.447989195041898}, {"type": "nauc_recall_at_1_diff1", "value": 46.31602607957798}, {"type": "nauc_recall_at_1_max", "value": 11.807757660801942}, {"type": "nauc_recall_at_1_std", "value": -13.984889089354317}, {"type": "nauc_recall_at_20_diff1", "value": 12.233372054304805}, {"type": "nauc_recall_at_20_max", "value": 22.284108685207148}, {"type": "nauc_recall_at_20_std", "value": -4.317138366746209}, {"type": "nauc_recall_at_3_diff1", "value": 28.394631527225815}, {"type": "nauc_recall_at_3_max", "value": 15.593864852625462}, {"type": "nauc_recall_at_3_std", "value": -12.383531804314593}, {"type": "nauc_recall_at_5_diff1", "value": 24.457441304950343}, {"type": "nauc_recall_at_5_max", "value": 19.080049396281623}, {"type": "nauc_recall_at_5_std", "value": -11.879747703626627}, {"type": "ndcg_at_1", "value": 51.548}, {"type": "ndcg_at_10", "value": 
41.998000000000005}, {"type": "ndcg_at_100", "value": 39.626}, {"type": "ndcg_at_1000", "value": 48.707}, {"type": "ndcg_at_20", "value": 40.181}, {"type": "ndcg_at_3", "value": 48.06}, {"type": "ndcg_at_5", "value": 45.829}, {"type": "precision_at_1", "value": 52.941}, {"type": "precision_at_10", "value": 31.330999999999996}, {"type": "precision_at_100", "value": 10.421}, {"type": "precision_at_1000", "value": 2.428}, {"type": "precision_at_20", "value": 24.118000000000002}, {"type": "precision_at_3", "value": 45.408}, {"type": "precision_at_5", "value": 39.938}, {"type": "recall_at_1", "value": 6.907000000000001}, {"type": "recall_at_10", "value": 20.51}, {"type": "recall_at_100", "value": 40.857}, {"type": "recall_at_1000", "value": 73.616}, {"type": "recall_at_20", "value": 26.52}, {"type": "recall_at_3", "value": 13.267999999999999}, {"type": "recall_at_5", "value": 16.141}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB NQ", "type": "mteb/nq", "config": "default", "split": "test", "revision": "b774495ed302d8c44a3a7ea25c90dbce03968f31"}, "metrics": [{"type": "main_score", "value": 71.8}, {"type": "map_at_1", "value": 47.629}, {"type": "map_at_10", "value": 64.846}, {"type": "map_at_100", "value": 65.40899999999999}, {"type": "map_at_1000", "value": 65.416}, {"type": "map_at_20", "value": 65.239}, {"type": "map_at_3", "value": 61.185}, {"type": "map_at_5", "value": 63.583}, {"type": "mrr_at_1", "value": 53.15758980301275}, {"type": "mrr_at_10", "value": 67.12880961577366}, {"type": "mrr_at_100", "value": 67.44006405426018}, {"type": "mrr_at_1000", "value": 67.44519150402294}, {"type": "mrr_at_20", "value": 67.34317135515428}, {"type": "mrr_at_3", "value": 64.5905755117805}, {"type": "mrr_at_5", "value": 66.24613750482806}, {"type": "nauc_map_at_1000_diff1", "value": 45.73812106517133}, {"type": "nauc_map_at_1000_max", "value": 35.21262031755756}, {"type": "nauc_map_at_1000_std", "value": -5.549443574026027}, {"type": "nauc_map_at_100_diff1", "value": 45.74254652176879}, {"type": "nauc_map_at_100_max", "value": 35.22349167515518}, {"type": "nauc_map_at_100_std", "value": -5.53697496044773}, {"type": "nauc_map_at_10_diff1", "value": 45.62837128377087}, {"type": "nauc_map_at_10_max", "value": 35.3261562342222}, {"type": "nauc_map_at_10_std", "value": -5.761924414031163}, {"type": "nauc_map_at_1_diff1", "value": 48.69187848570499}, {"type": "nauc_map_at_1_max", "value": 28.687996096473476}, {"type": "nauc_map_at_1_std", "value": -7.518605958272523}, {"type": "nauc_map_at_20_diff1", "value": 45.702303442220035}, {"type": "nauc_map_at_20_max", "value": 35.30719944705456}, {"type": "nauc_map_at_20_std", "value": -5.59505654742681}, {"type": "nauc_map_at_3_diff1", "value": 45.376813726832474}, {"type": "nauc_map_at_3_max", "value": 34.68452149643597}, {"type": "nauc_map_at_3_std", "value": -7.329014950379634}, {"type": "nauc_map_at_5_diff1", "value": 45.29528861989316}, {"type": "nauc_map_at_5_max", "value": 35.35741440869229}, {"type": "nauc_map_at_5_std", "value": -6.028788612259288}, {"type": "nauc_mrr_at_1000_diff1", "value": 46.11808147912517}, {"type": "nauc_mrr_at_1000_max", "value": 35.59241850411947}, {"type": "nauc_mrr_at_1000_std", "value": -3.4072428526109317}, {"type": "nauc_mrr_at_100_diff1", "value": 46.121345545514046}, {"type": "nauc_mrr_at_100_max", "value": 35.60147795073431}, {"type": "nauc_mrr_at_100_std", "value": -3.3965322447588826}, {"type": "nauc_mrr_at_10_diff1", "value": 46.0920068210502}, {"type": "nauc_mrr_at_10_max", "value": 35.79649987854354}, 
{"type": "nauc_mrr_at_10_std", "value": -3.339624589368137}, {"type": "nauc_mrr_at_1_diff1", "value": 49.101364605656194}, {"type": "nauc_mrr_at_1_max", "value": 31.500796071482146}, {"type": "nauc_mrr_at_1_std", "value": -4.183818500718156}, {"type": "nauc_mrr_at_20_diff1", "value": 46.088076630465594}, {"type": "nauc_mrr_at_20_max", "value": 35.682131663053205}, {"type": "nauc_mrr_at_20_std", "value": -3.35939023178519}, {"type": "nauc_mrr_at_3_diff1", "value": 45.47570812708642}, {"type": "nauc_mrr_at_3_max", "value": 35.741892517632984}, {"type": "nauc_mrr_at_3_std", "value": -4.135335963822013}, {"type": "nauc_mrr_at_5_diff1", "value": 45.78903474184014}, {"type": "nauc_mrr_at_5_max", "value": 35.91273593700205}, {"type": "nauc_mrr_at_5_std", "value": -3.467873421286869}, {"type": "nauc_ndcg_at_1000_diff1", "value": 45.5056583000012}, {"type": "nauc_ndcg_at_1000_max", "value": 36.34328379251593}, {"type": "nauc_ndcg_at_1000_std", "value": -4.0759698229323345}, {"type": "nauc_ndcg_at_100_diff1", "value": 45.61918946477166}, {"type": "nauc_ndcg_at_100_max", "value": 36.675460335836235}, {"type": "nauc_ndcg_at_100_std", "value": -3.6795334726235986}, {"type": "nauc_ndcg_at_10_diff1", "value": 45.15343994274541}, {"type": "nauc_ndcg_at_10_max", "value": 37.48139242964657}, {"type": "nauc_ndcg_at_10_std", "value": -4.287039084554882}, {"type": "nauc_ndcg_at_1_diff1", "value": 49.101364605656194}, {"type": "nauc_ndcg_at_1_max", "value": 31.500796071482146}, {"type": "nauc_ndcg_at_1_std", "value": -4.183818500718156}, {"type": "nauc_ndcg_at_20_diff1", "value": 45.310026313402375}, {"type": "nauc_ndcg_at_20_max", "value": 37.32177497902133}, {"type": "nauc_ndcg_at_20_std", "value": -3.8214360391282587}, {"type": "nauc_ndcg_at_3_diff1", "value": 44.27064370528994}, {"type": "nauc_ndcg_at_3_max", "value": 36.380294033571396}, {"type": "nauc_ndcg_at_3_std", "value": -6.844263370898355}, {"type": "nauc_ndcg_at_5_diff1", "value": 44.29933499225583}, {"type": "nauc_ndcg_at_5_max", "value": 37.46477041822136}, {"type": "nauc_ndcg_at_5_std", "value": -4.866548530467956}, {"type": "nauc_precision_at_1000_diff1", "value": -14.666553359142306}, {"type": "nauc_precision_at_1000_max", "value": -0.5599759853201481}, {"type": "nauc_precision_at_1000_std", "value": 16.8370925526591}, {"type": "nauc_precision_at_100_diff1", "value": -11.816251306246278}, {"type": "nauc_precision_at_100_max", "value": 2.969819268208207}, {"type": "nauc_precision_at_100_std", "value": 18.59422946634747}, {"type": "nauc_precision_at_10_diff1", "value": 1.2050200086029401}, {"type": "nauc_precision_at_10_max", "value": 17.59930352911209}, {"type": "nauc_precision_at_10_std", "value": 13.714495717588985}, {"type": "nauc_precision_at_1_diff1", "value": 49.101364605656194}, {"type": "nauc_precision_at_1_max", "value": 31.500796071482146}, {"type": "nauc_precision_at_1_std", "value": -4.183818500718156}, {"type": "nauc_precision_at_20_diff1", "value": -5.263476664822757}, {"type": "nauc_precision_at_20_max", "value": 11.42004823600046}, {"type": "nauc_precision_at_20_std", "value": 16.510514518664994}, {"type": "nauc_precision_at_3_diff1", "value": 20.116460379305828}, {"type": "nauc_precision_at_3_max", "value": 31.32235038301311}, {"type": "nauc_precision_at_3_std", "value": 2.7486717133871923}, {"type": "nauc_precision_at_5_diff1", "value": 9.57451645335723}, {"type": "nauc_precision_at_5_max", "value": 25.28449126580587}, {"type": "nauc_precision_at_5_std", "value": 9.955736162466767}, {"type": "nauc_recall_at_1000_diff1", 
"value": -21.632253065978794}, {"type": "nauc_recall_at_1000_max", "value": 70.14409090958776}, {"type": "nauc_recall_at_1000_std", "value": 65.61658090892989}, {"type": "nauc_recall_at_100_diff1", "value": 51.83161124806711}, {"type": "nauc_recall_at_100_max", "value": 77.49921361841523}, {"type": "nauc_recall_at_100_std", "value": 48.352508746719444}, {"type": "nauc_recall_at_10_diff1", "value": 39.86695231362791}, {"type": "nauc_recall_at_10_max", "value": 50.12029094799474}, {"type": "nauc_recall_at_10_std", "value": 0.1650940628131058}, {"type": "nauc_recall_at_1_diff1", "value": 48.69187848570499}, {"type": "nauc_recall_at_1_max", "value": 28.687996096473476}, {"type": "nauc_recall_at_1_std", "value": -7.518605958272523}, {"type": "nauc_recall_at_20_diff1", "value": 39.14155398061627}, {"type": "nauc_recall_at_20_max", "value": 56.78559423716229}, {"type": "nauc_recall_at_20_std", "value": 7.9728224572344075}, {"type": "nauc_recall_at_3_diff1", "value": 38.69589523432158}, {"type": "nauc_recall_at_3_max", "value": 39.53271258375579}, {"type": "nauc_recall_at_3_std", "value": -8.646925065787512}, {"type": "nauc_recall_at_5_diff1", "value": 37.45922652959002}, {"type": "nauc_recall_at_5_max", "value": 44.4911958995867}, {"type": "nauc_recall_at_5_std", "value": -3.5659842556375594}, {"type": "ndcg_at_1", "value": 53.15800000000001}, {"type": "ndcg_at_10", "value": 71.8}, {"type": "ndcg_at_100", "value": 73.85199999999999}, {"type": "ndcg_at_1000", "value": 74.017}, {"type": "ndcg_at_20", "value": 72.933}, {"type": "ndcg_at_3", "value": 65.479}, {"type": "ndcg_at_5", "value": 69.182}, {"type": "precision_at_1", "value": 53.15800000000001}, {"type": "precision_at_10", "value": 10.805}, {"type": "precision_at_100", "value": 1.2}, {"type": "precision_at_1000", "value": 0.122}, {"type": "precision_at_20", "value": 5.694}, {"type": "precision_at_3", "value": 28.939999999999998}, {"type": "precision_at_5", "value": 19.641000000000002}, {"type": "recall_at_1", "value": 47.629}, {"type": "recall_at_10", "value": 90.204}, {"type": "recall_at_100", "value": 98.66}, {"type": "recall_at_1000", "value": 99.874}, {"type": "recall_at_20", "value": 94.24}, {"type": "recall_at_3", "value": 74.394}, {"type": "recall_at_5", "value": 82.711}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB QuoraRetrieval", "type": "mteb/quora", "config": "default", "split": "test", "revision": "e4e08e0b7dbe3c8700f0daef558ff32256715259"}, "metrics": [{"type": "main_score", "value": 90.025}, {"type": "map_at_1", "value": 72.222}, {"type": "map_at_10", "value": 86.58500000000001}, {"type": "map_at_100", "value": 87.176}, {"type": "map_at_1000", "value": 87.188}, {"type": "map_at_20", "value": 86.97399999999999}, {"type": "map_at_3", "value": 83.736}, {"type": "map_at_5", "value": 85.554}, {"type": "mrr_at_1", "value": 83.04}, {"type": "mrr_at_10", "value": 89.05599603174585}, {"type": "mrr_at_100", "value": 89.12398891419457}, {"type": "mrr_at_1000", "value": 89.12434072241001}, {"type": "mrr_at_20", "value": 89.10416280692111}, {"type": "mrr_at_3", "value": 88.23833333333312}, {"type": "mrr_at_5", "value": 88.82233333333308}, {"type": "nauc_map_at_1000_diff1", "value": 78.29348113313218}, {"type": "nauc_map_at_1000_max", "value": 32.31386754277228}, {"type": "nauc_map_at_1000_std", "value": -50.47543661484052}, {"type": "nauc_map_at_100_diff1", "value": 78.29618548618575}, {"type": "nauc_map_at_100_max", "value": 32.301475680947846}, {"type": "nauc_map_at_100_std", "value": -50.50303428814228}, {"type": 
"nauc_map_at_10_diff1", "value": 78.47383776440803}, {"type": "nauc_map_at_10_max", "value": 31.839339990133563}, {"type": "nauc_map_at_10_std", "value": -52.832713555976}, {"type": "nauc_map_at_1_diff1", "value": 82.46330147467418}, {"type": "nauc_map_at_1_max", "value": 23.497664918373538}, {"type": "nauc_map_at_1_std", "value": -43.824657665520704}, {"type": "nauc_map_at_20_diff1", "value": 78.34772176474422}, {"type": "nauc_map_at_20_max", "value": 32.16495182893947}, {"type": "nauc_map_at_20_std", "value": -51.503292726558605}, {"type": "nauc_map_at_3_diff1", "value": 79.07823813069432}, {"type": "nauc_map_at_3_max", "value": 29.395911687513976}, {"type": "nauc_map_at_3_std", "value": -54.16377546873304}, {"type": "nauc_map_at_5_diff1", "value": 78.73076619520454}, {"type": "nauc_map_at_5_max", "value": 30.700453118585237}, {"type": "nauc_map_at_5_std", "value": -54.130514177664054}, {"type": "nauc_mrr_at_1000_diff1", "value": 79.04736184471865}, {"type": "nauc_mrr_at_1000_max", "value": 34.43004593837643}, {"type": "nauc_mrr_at_1000_std", "value": -46.137269068195316}, {"type": "nauc_mrr_at_100_diff1", "value": 79.04698704288086}, {"type": "nauc_mrr_at_100_max", "value": 34.4305553741175}, {"type": "nauc_mrr_at_100_std", "value": -46.13786687786434}, {"type": "nauc_mrr_at_10_diff1", "value": 79.04490677485934}, {"type": "nauc_mrr_at_10_max", "value": 34.38170181522227}, {"type": "nauc_mrr_at_10_std", "value": -46.38129875681807}, {"type": "nauc_mrr_at_1_diff1", "value": 79.87159215719124}, {"type": "nauc_mrr_at_1_max", "value": 34.05882339253136}, {"type": "nauc_mrr_at_1_std", "value": -43.56093395137571}, {"type": "nauc_mrr_at_20_diff1", "value": 79.04384174535653}, {"type": "nauc_mrr_at_20_max", "value": 34.442136494675005}, {"type": "nauc_mrr_at_20_std", "value": -46.205458519638654}, {"type": "nauc_mrr_at_3_diff1", "value": 78.78154519155487}, {"type": "nauc_mrr_at_3_max", "value": 34.74995000500305}, {"type": "nauc_mrr_at_3_std", "value": -46.36264203155416}, {"type": "nauc_mrr_at_5_diff1", "value": 79.02631187177}, {"type": "nauc_mrr_at_5_max", "value": 34.538698249632205}, {"type": "nauc_mrr_at_5_std", "value": -46.468881576157465}, {"type": "nauc_ndcg_at_1000_diff1", "value": 78.25260097014645}, {"type": "nauc_ndcg_at_1000_max", "value": 33.68584498704271}, {"type": "nauc_ndcg_at_1000_std", "value": -48.44716779494868}, {"type": "nauc_ndcg_at_100_diff1", "value": 78.25115412256716}, {"type": "nauc_ndcg_at_100_max", "value": 33.63652663447088}, {"type": "nauc_ndcg_at_100_std", "value": -48.489243909024715}, {"type": "nauc_ndcg_at_10_diff1", "value": 78.23875101557334}, {"type": "nauc_ndcg_at_10_max", "value": 32.65217430043823}, {"type": "nauc_ndcg_at_10_std", "value": -52.57770468845309}, {"type": "nauc_ndcg_at_1_diff1", "value": 79.87159215719124}, {"type": "nauc_ndcg_at_1_max", "value": 34.05882339253136}, {"type": "nauc_ndcg_at_1_std", "value": -43.56093395137571}, {"type": "nauc_ndcg_at_20_diff1", "value": 78.23478552311765}, {"type": "nauc_ndcg_at_20_max", "value": 33.30691737901109}, {"type": "nauc_ndcg_at_20_std", "value": -50.78412614854527}, {"type": "nauc_ndcg_at_3_diff1", "value": 77.66134485470224}, {"type": "nauc_ndcg_at_3_max", "value": 32.19504710373125}, {"type": "nauc_ndcg_at_3_std", "value": -52.01636728550155}, {"type": "nauc_ndcg_at_5_diff1", "value": 78.04734137324255}, {"type": "nauc_ndcg_at_5_max", "value": 31.94593625591248}, {"type": "nauc_ndcg_at_5_std", "value": -53.02169800690546}, {"type": "nauc_precision_at_1000_diff1", "value": 
-45.771948123542636}, {"type": "nauc_precision_at_1000_max", "value": -5.182406190477681}, {"type": "nauc_precision_at_1000_std", "value": 41.14460438707817}, {"type": "nauc_precision_at_100_diff1", "value": -45.64767154261461}, {"type": "nauc_precision_at_100_max", "value": -5.046308286851713}, {"type": "nauc_precision_at_100_std", "value": 41.07186716587844}, {"type": "nauc_precision_at_10_diff1", "value": -42.26779562305825}, {"type": "nauc_precision_at_10_max", "value": -1.1264852893323076}, {"type": "nauc_precision_at_10_std", "value": 27.62275729822392}, {"type": "nauc_precision_at_1_diff1", "value": 79.87159215719124}, {"type": "nauc_precision_at_1_max", "value": 34.05882339253136}, {"type": "nauc_precision_at_1_std", "value": -43.56093395137571}, {"type": "nauc_precision_at_20_diff1", "value": -44.24293221128388}, {"type": "nauc_precision_at_20_max", "value": -3.1345628837361867}, {"type": "nauc_precision_at_20_std", "value": 34.23625492740366}, {"type": "nauc_precision_at_3_diff1", "value": -24.925251389823348}, {"type": "nauc_precision_at_3_max", "value": 6.622188833369412}, {"type": "nauc_precision_at_3_std", "value": 6.424741786858512}, {"type": "nauc_precision_at_5_diff1", "value": -36.1407949990387}, {"type": "nauc_precision_at_5_max", "value": 1.7533948968374462}, {"type": "nauc_precision_at_5_std", "value": 17.914083278982634}, {"type": "nauc_recall_at_1000_diff1", "value": 52.26815466244496}, {"type": "nauc_recall_at_1000_max", "value": 69.73611104239443}, {"type": "nauc_recall_at_1000_std", "value": 73.18969965863008}, {"type": "nauc_recall_at_100_diff1", "value": 70.80557513785271}, {"type": "nauc_recall_at_100_max", "value": 33.333440086544556}, {"type": "nauc_recall_at_100_std", "value": -38.75992366905504}, {"type": "nauc_recall_at_10_diff1", "value": 74.45948457438163}, {"type": "nauc_recall_at_10_max", "value": 26.64948512428989}, {"type": "nauc_recall_at_10_std", "value": -82.90334292052363}, {"type": "nauc_recall_at_1_diff1", "value": 82.46330147467418}, {"type": "nauc_recall_at_1_max", "value": 23.497664918373538}, {"type": "nauc_recall_at_1_std", "value": -43.824657665520704}, {"type": "nauc_recall_at_20_diff1", "value": 73.80140280887753}, {"type": "nauc_recall_at_20_max", "value": 30.361616426734965}, {"type": "nauc_recall_at_20_std", "value": -81.1418804447414}, {"type": "nauc_recall_at_3_diff1", "value": 75.19854736087834}, {"type": "nauc_recall_at_3_max", "value": 26.12298005045584}, {"type": "nauc_recall_at_3_std", "value": -63.42583714745169}, {"type": "nauc_recall_at_5_diff1", "value": 74.16423451950358}, {"type": "nauc_recall_at_5_max", "value": 25.552390331018987}, {"type": "nauc_recall_at_5_std", "value": -71.15891947773912}, {"type": "ndcg_at_1", "value": 83.04}, {"type": "ndcg_at_10", "value": 90.025}, {"type": "ndcg_at_100", "value": 91.006}, {"type": "ndcg_at_1000", "value": 91.061}, {"type": "ndcg_at_20", "value": 90.556}, {"type": "ndcg_at_3", "value": 87.493}, {"type": "ndcg_at_5", "value": 88.955}, {"type": "precision_at_1", "value": 83.04}, {"type": "precision_at_10", "value": 13.667000000000002}, {"type": "precision_at_100", "value": 1.542}, {"type": "precision_at_1000", "value": 0.157}, {"type": "precision_at_20", "value": 7.221}, {"type": "precision_at_3", "value": 38.433}, {"type": "precision_at_5", "value": 25.228}, {"type": "recall_at_1", "value": 72.222}, {"type": "recall_at_10", "value": 96.604}, {"type": "recall_at_100", "value": 99.786}, {"type": "recall_at_1000", "value": 99.996}, {"type": "recall_at_20", "value": 98.253}, {"type": 
"recall_at_3", "value": 89.276}, {"type": "recall_at_5", "value": 93.46}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB RedditClustering", "type": "mteb/reddit-clustering", "config": "default", "split": "test", "revision": "24640382cdbf8abc73003fb0fa6d111a705499eb"}, "metrics": [{"type": "main_score", "value": 72.86492101891123}, {"type": "v_measure", "value": 72.86492101891123}, {"type": "v_measure_std", "value": 2.778711445144635}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB RedditClusteringP2P", "type": "mteb/reddit-clustering-p2p", "config": "default", "split": "test", "revision": "385e3cb46b4cfa89021f56c4380204149d0efe33"}, "metrics": [{"type": "main_score", "value": 75.27316726548479}, {"type": "v_measure", "value": 75.27316726548479}, {"type": "v_measure_std", "value": 8.87871936725338}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB SCIDOCS", "type": "mteb/scidocs", "config": "default", "split": "test", "revision": "f8c2fcf00f625baaa80f62ec5bd9e1fff3b8ae88"}, "metrics": [{"type": "main_score", "value": 26.638}, {"type": "map_at_1", "value": 6.128}, {"type": "map_at_10", "value": 16.472}, {"type": "map_at_100", "value": 19.522000000000002}, {"type": "map_at_1000", "value": 19.898}, {"type": "map_at_20", "value": 18.098}, {"type": "map_at_3", "value": 11.283}, {"type": "map_at_5", "value": 13.771}, {"type": "mrr_at_1", "value": 30.2}, {"type": "mrr_at_10", "value": 42.621150793650735}, {"type": "mrr_at_100", "value": 43.740858712021954}, {"type": "mrr_at_1000", "value": 43.762699500220904}, {"type": "mrr_at_20", "value": 43.383639927753634}, {"type": "mrr_at_3", "value": 38.83333333333331}, {"type": "mrr_at_5", "value": 41.14833333333326}, {"type": "nauc_map_at_1000_diff1", "value": 13.13534664124808}, {"type": "nauc_map_at_1000_max", "value": 29.346654566149795}, {"type": "nauc_map_at_1000_std", "value": 18.08121186982413}, {"type": "nauc_map_at_100_diff1", "value": 13.098072728041538}, {"type": "nauc_map_at_100_max", "value": 29.299084480697523}, {"type": "nauc_map_at_100_std", "value": 17.961620202918464}, {"type": "nauc_map_at_10_diff1", "value": 14.001743720394682}, {"type": "nauc_map_at_10_max", "value": 28.04128290996403}, {"type": "nauc_map_at_10_std", "value": 13.744481555974716}, {"type": "nauc_map_at_1_diff1", "value": 22.1926640424872}, {"type": "nauc_map_at_1_max", "value": 21.32609279586034}, {"type": "nauc_map_at_1_std", "value": 6.566596302915438}, {"type": "nauc_map_at_20_diff1", "value": 13.57313142419664}, {"type": "nauc_map_at_20_max", "value": 28.93840146319476}, {"type": "nauc_map_at_20_std", "value": 16.50869367365676}, {"type": "nauc_map_at_3_diff1", "value": 17.707700541948462}, {"type": "nauc_map_at_3_max", "value": 26.058174051376238}, {"type": "nauc_map_at_3_std", "value": 9.943924560735267}, {"type": "nauc_map_at_5_diff1", "value": 17.11844492157723}, {"type": "nauc_map_at_5_max", "value": 27.865247403049388}, {"type": "nauc_map_at_5_std", "value": 11.372588172121546}, {"type": "nauc_mrr_at_1000_diff1", "value": 21.11248719936198}, {"type": "nauc_mrr_at_1000_max", "value": 26.734172102201466}, {"type": "nauc_mrr_at_1000_std", "value": 11.766121765437228}, {"type": "nauc_mrr_at_100_diff1", "value": 21.107109982277702}, {"type": "nauc_mrr_at_100_max", "value": 26.741616065723267}, {"type": "nauc_mrr_at_100_std", "value": 11.789802686224208}, {"type": "nauc_mrr_at_10_diff1", "value": 20.74108639793207}, {"type": "nauc_mrr_at_10_max", "value": 26.920838463358333}, {"type": "nauc_mrr_at_10_std", "value": 
11.849217361926522}, {"type": "nauc_mrr_at_1_diff1", "value": 22.177437860573356}, {"type": "nauc_mrr_at_1_max", "value": 21.88074521417754}, {"type": "nauc_mrr_at_1_std", "value": 6.776011900101789}, {"type": "nauc_mrr_at_20_diff1", "value": 21.126633710175994}, {"type": "nauc_mrr_at_20_max", "value": 26.860736480370974}, {"type": "nauc_mrr_at_20_std", "value": 11.815411633726338}, {"type": "nauc_mrr_at_3_diff1", "value": 21.689245200066466}, {"type": "nauc_mrr_at_3_max", "value": 26.187305092831625}, {"type": "nauc_mrr_at_3_std", "value": 10.895380313134332}, {"type": "nauc_mrr_at_5_diff1", "value": 20.898811082479778}, {"type": "nauc_mrr_at_5_max", "value": 26.939217247104036}, {"type": "nauc_mrr_at_5_std", "value": 11.77832949822472}, {"type": "nauc_ndcg_at_1000_diff1", "value": 13.251184947898546}, {"type": "nauc_ndcg_at_1000_max", "value": 30.879594164526146}, {"type": "nauc_ndcg_at_1000_std", "value": 23.125206047366625}, {"type": "nauc_ndcg_at_100_diff1", "value": 12.549100649053676}, {"type": "nauc_ndcg_at_100_max", "value": 30.634680845419123}, {"type": "nauc_ndcg_at_100_std", "value": 23.296226055422984}, {"type": "nauc_ndcg_at_10_diff1", "value": 14.475144549294322}, {"type": "nauc_ndcg_at_10_max", "value": 29.450349815417336}, {"type": "nauc_ndcg_at_10_std", "value": 15.94068314781612}, {"type": "nauc_ndcg_at_1_diff1", "value": 22.177437860573356}, {"type": "nauc_ndcg_at_1_max", "value": 21.88074521417754}, {"type": "nauc_ndcg_at_1_std", "value": 6.776011900101789}, {"type": "nauc_ndcg_at_20_diff1", "value": 14.173669585802266}, {"type": "nauc_ndcg_at_20_max", "value": 30.475890854725}, {"type": "nauc_ndcg_at_20_std", "value": 19.863898148221704}, {"type": "nauc_ndcg_at_3_diff1", "value": 18.93971261196868}, {"type": "nauc_ndcg_at_3_max", "value": 27.3707298720736}, {"type": "nauc_ndcg_at_3_std", "value": 11.439810510051224}, {"type": "nauc_ndcg_at_5_diff1", "value": 17.89535958094687}, {"type": "nauc_ndcg_at_5_max", "value": 29.272740466638425}, {"type": "nauc_ndcg_at_5_std", "value": 13.402467626635909}, {"type": "nauc_precision_at_1000_diff1", "value": -3.811547048784123}, {"type": "nauc_precision_at_1000_max", "value": 22.55165337197117}, {"type": "nauc_precision_at_1000_std", "value": 35.98524999650108}, {"type": "nauc_precision_at_100_diff1", "value": 0.6474234774922896}, {"type": "nauc_precision_at_100_max", "value": 25.06920726527032}, {"type": "nauc_precision_at_100_std", "value": 32.31439698982313}, {"type": "nauc_precision_at_10_diff1", "value": 7.943127218139508}, {"type": "nauc_precision_at_10_max", "value": 28.571937636787197}, {"type": "nauc_precision_at_10_std", "value": 18.8472620918488}, {"type": "nauc_precision_at_1_diff1", "value": 22.177437860573356}, {"type": "nauc_precision_at_1_max", "value": 21.88074521417754}, {"type": "nauc_precision_at_1_std", "value": 6.776011900101789}, {"type": "nauc_precision_at_20_diff1", "value": 6.981574259607366}, {"type": "nauc_precision_at_20_max", "value": 28.986094397038727}, {"type": "nauc_precision_at_20_std", "value": 25.83129974001146}, {"type": "nauc_precision_at_3_diff1", "value": 17.197490724039355}, {"type": "nauc_precision_at_3_max", "value": 29.17569320583099}, {"type": "nauc_precision_at_3_std", "value": 13.430554945991846}, {"type": "nauc_precision_at_5_diff1", "value": 14.952364330739362}, {"type": "nauc_precision_at_5_max", "value": 31.053243354846977}, {"type": "nauc_precision_at_5_std", "value": 15.856312752807822}, {"type": "nauc_recall_at_1000_diff1", "value": -4.8224253128926975}, {"type": 
"nauc_recall_at_1000_max", "value": 21.3989024429911}, {"type": "nauc_recall_at_1000_std", "value": 39.152234275603604}, {"type": "nauc_recall_at_100_diff1", "value": 0.11936808422867201}, {"type": "nauc_recall_at_100_max", "value": 24.261739241957823}, {"type": "nauc_recall_at_100_std", "value": 32.62984573938928}, {"type": "nauc_recall_at_10_diff1", "value": 7.851256165018388}, {"type": "nauc_recall_at_10_max", "value": 27.936406600938746}, {"type": "nauc_recall_at_10_std", "value": 18.683634320636113}, {"type": "nauc_recall_at_1_diff1", "value": 22.1926640424872}, {"type": "nauc_recall_at_1_max", "value": 21.32609279586034}, {"type": "nauc_recall_at_1_std", "value": 6.566596302915438}, {"type": "nauc_recall_at_20_diff1", "value": 6.8107211705182165}, {"type": "nauc_recall_at_20_max", "value": 28.286284094687787}, {"type": "nauc_recall_at_20_std", "value": 25.932013268120862}, {"type": "nauc_recall_at_3_diff1", "value": 17.04156818427151}, {"type": "nauc_recall_at_3_max", "value": 28.645439108719216}, {"type": "nauc_recall_at_3_std", "value": 13.346047828494411}, {"type": "nauc_recall_at_5_diff1", "value": 14.906284329771822}, {"type": "nauc_recall_at_5_max", "value": 30.58628602415921}, {"type": "nauc_recall_at_5_std", "value": 15.755157478191755}, {"type": "ndcg_at_1", "value": 30.2}, {"type": "ndcg_at_10", "value": 26.638}, {"type": "ndcg_at_100", "value": 37.135}, {"type": "ndcg_at_1000", "value": 42.576}, {"type": "ndcg_at_20", "value": 30.75}, {"type": "ndcg_at_3", "value": 24.675}, {"type": "ndcg_at_5", "value": 21.836}, {"type": "precision_at_1", "value": 30.2}, {"type": "precision_at_10", "value": 14.06}, {"type": "precision_at_100", "value": 2.904}, {"type": "precision_at_1000", "value": 0.42}, {"type": "precision_at_20", "value": 9.4}, {"type": "precision_at_3", "value": 23.233}, {"type": "precision_at_5", "value": 19.439999999999998}, {"type": "recall_at_1", "value": 6.128}, {"type": "recall_at_10", "value": 28.471999999999998}, {"type": "recall_at_100", "value": 58.952000000000005}, {"type": "recall_at_1000", "value": 85.137}, {"type": "recall_at_20", "value": 38.17}, {"type": "recall_at_3", "value": 14.127999999999998}, {"type": "recall_at_5", "value": 19.673}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB SICK-R", "type": "mteb/sickr-sts", "config": "default", "split": "test", "revision": "20a6d6f312dd54037fe07a32d58e5e168867909d"}, "metrics": [{"type": "cosine_pearson", "value": 86.86608529160739}, {"type": "cosine_spearman", "value": 82.88625166203383}, {"type": "euclidean_pearson", "value": 84.15494418856142}, {"type": "euclidean_spearman", "value": 82.88449294676421}, {"type": "main_score", "value": 82.88625166203383}, {"type": "manhattan_pearson", "value": 84.39068623474428}, {"type": "manhattan_spearman", "value": 82.88065412169463}, {"type": "pearson", "value": 86.86608529160739}, {"type": "spearman", "value": 82.88625166203383}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS12", "type": "mteb/sts12-sts", "config": "default", "split": "test", "revision": "a0d554a64d88156834ff5ae9920b964011b16384"}, "metrics": [{"type": "cosine_pearson", "value": 87.0445014940449}, {"type": "cosine_spearman", "value": 80.0880365116599}, {"type": "euclidean_pearson", "value": 83.80250772928852}, {"type": "euclidean_spearman", "value": 80.0892465260778}, {"type": "main_score", "value": 80.0880365116599}, {"type": "manhattan_pearson", "value": 83.96793981929336}, {"type": "manhattan_spearman", "value": 80.24881789268238}, {"type": "pearson", "value": 
87.0445014940449}, {"type": "spearman", "value": 80.0880365116599}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS13", "type": "mteb/sts13-sts", "config": "default", "split": "test", "revision": "7e90230a92c190f1bf69ae9002b8cea547a64cca"}, "metrics": [{"type": "cosine_pearson", "value": 89.33900828959968}, {"type": "cosine_spearman", "value": 89.68256358526733}, {"type": "euclidean_pearson", "value": 89.29188708262265}, {"type": "euclidean_spearman", "value": 89.68204344658601}, {"type": "main_score", "value": 89.68256358526733}, {"type": "manhattan_pearson", "value": 89.13996588193149}, {"type": "manhattan_spearman", "value": 89.61372804425623}, {"type": "pearson", "value": 89.33900828959968}, {"type": "spearman", "value": 89.68256358526733}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS14", "type": "mteb/sts14-sts", "config": "default", "split": "test", "revision": "6031580fec1f6af667f0bd2da0a551cf4f0b2375"}, "metrics": [{"type": "cosine_pearson", "value": 86.42029843639123}, {"type": "cosine_spearman", "value": 85.0707889220723}, {"type": "euclidean_pearson", "value": 85.75114239552562}, {"type": "euclidean_spearman", "value": 85.06858160270725}, {"type": "main_score", "value": 85.0707889220723}, {"type": "manhattan_pearson", "value": 85.86461900459038}, {"type": "manhattan_spearman", "value": 85.28671103475605}, {"type": "pearson", "value": 86.42029843639123}, {"type": "spearman", "value": 85.0707889220723}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS15", "type": "mteb/sts15-sts", "config": "default", "split": "test", "revision": "ae752c7c21bf194d8b67fd573edf7ae58183cbe3"}, "metrics": [{"type": "cosine_pearson", "value": 88.3660081271444}, {"type": "cosine_spearman", "value": 89.39375083609528}, {"type": "euclidean_pearson", "value": 89.21818482894895}, {"type": "euclidean_spearman", "value": 89.39361588875443}, {"type": "main_score", "value": 89.39375083609528}, {"type": "manhattan_pearson", "value": 89.53535068014057}, {"type": "manhattan_spearman", "value": 89.81077130567752}, {"type": "pearson", "value": 88.3660081271444}, {"type": "spearman", "value": 89.39375083609528}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS16", "type": "mteb/sts16-sts", "config": "default", "split": "test", "revision": "4d8694f8f0e0100860b497b999b3dbed754a0513"}, "metrics": [{"type": "cosine_pearson", "value": 85.60708247171874}, {"type": "cosine_spearman", "value": 87.15234952832193}, {"type": "euclidean_pearson", "value": 86.21743555548137}, {"type": "euclidean_spearman", "value": 87.14450217418016}, {"type": "main_score", "value": 87.15234952832193}, {"type": "manhattan_pearson", "value": 86.2467748746084}, {"type": "manhattan_spearman", "value": 87.2197479717654}, {"type": "pearson", "value": 85.60708247171874}, {"type": "spearman", "value": 87.15234952832193}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS17 (en-en)", "type": "mteb/sts17-crosslingual-sts", "config": "en-en", "split": "test", "revision": "faeb762787bd10488a50c8b5be4a3b82e411949c"}, "metrics": [{"type": "cosine_pearson", "value": 91.25898556808458}, {"type": "cosine_spearman", "value": 91.35372390581641}, {"type": "euclidean_pearson", "value": 91.319520321348}, {"type": "euclidean_spearman", "value": 91.30821135416925}, {"type": "main_score", "value": 91.35372390581641}, {"type": "manhattan_pearson", "value": 91.14800959939069}, {"type": "manhattan_spearman", "value": 91.09775424245629}, {"type": "pearson", "value": 91.25898556808458}, {"type": "spearman", "value": 
91.35372390581641}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS22 (en)", "type": "mteb/sts22-crosslingual-sts", "config": "en", "split": "test", "revision": "de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3"}, "metrics": [{"type": "cosine_pearson", "value": 67.61637111515797}, {"type": "cosine_spearman", "value": 68.10379096526697}, {"type": "euclidean_pearson", "value": 69.2652309491375}, {"type": "euclidean_spearman", "value": 68.18436357033228}, {"type": "main_score", "value": 68.10379096526697}, {"type": "manhattan_pearson", "value": 69.52531340510775}, {"type": "manhattan_spearman", "value": 68.17874790391862}, {"type": "pearson", "value": 67.61637111515797}, {"type": "spearman", "value": 68.10379096526697}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STSBenchmark", "type": "mteb/stsbenchmark-sts", "config": "default", "split": "test", "revision": "b0fddb56ed78048fa8b90373c8a3cfc37b684831"}, "metrics": [{"type": "cosine_pearson", "value": 87.81592853782297}, {"type": "cosine_spearman", "value": 88.2302550329183}, {"type": "euclidean_pearson", "value": 88.01165144519526}, {"type": "euclidean_spearman", "value": 88.23342148890097}, {"type": "main_score", "value": 88.2302550329183}, {"type": "manhattan_pearson", "value": 88.148592564938}, {"type": "manhattan_spearman", "value": 88.49226317320988}, {"type": "pearson", "value": 87.81592853782297}, {"type": "spearman", "value": 88.2302550329183}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB SciDocsRR", "type": "mteb/scidocs-reranking", "config": "default", "split": "test", "revision": "d3c5e1fc0b855ab6097bf1cda04dd73947d7caab"}, "metrics": [{"type": "main_score", "value": 89.196009707431}, {"type": "map", "value": 89.196009707431}, {"type": "mrr", "value": 97.07198121413808}, {"type": "nAUC_map_diff1", "value": -14.066667940115352}, {"type": "nAUC_map_max", "value": 49.73702475027407}, {"type": "nAUC_map_std", "value": 64.0986775782592}, {"type": "nAUC_mrr_diff1", "value": 21.96846389417319}, {"type": "nAUC_mrr_max", "value": 86.38341077184032}, {"type": "nAUC_mrr_std", "value": 75.38945014727746}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB SciFact", "type": "mteb/scifact", "config": "default", "split": "test", "revision": "0228b52cf27578f30900b9e5271d331663a030d7"}, "metrics": [{"type": "main_score", "value": 80.08999999999999}, {"type": "map_at_1", "value": 63.161}, {"type": "map_at_10", "value": 75.163}, {"type": "map_at_100", "value": 75.408}, {"type": "map_at_1000", "value": 75.409}, {"type": "map_at_20", "value": 75.332}, {"type": "map_at_3", "value": 71.839}, {"type": "map_at_5", "value": 74.32600000000001}, {"type": "mrr_at_1", "value": 66.33333333333333}, {"type": "mrr_at_10", "value": 75.95978835978836}, {"type": "mrr_at_100", "value": 76.15647881281473}, {"type": "mrr_at_1000", "value": 76.15736533763744}, {"type": "mrr_at_20", "value": 76.08557368557368}, {"type": "mrr_at_3", "value": 73.55555555555556}, {"type": "mrr_at_5", "value": 75.4888888888889}, {"type": "nauc_map_at_1000_diff1", "value": 77.31229383811176}, {"type": "nauc_map_at_1000_max", "value": 58.848319058605156}, {"type": "nauc_map_at_1000_std", "value": -14.290090263454985}, {"type": "nauc_map_at_100_diff1", "value": 77.31325400213969}, {"type": "nauc_map_at_100_max", "value": 58.848885054155275}, {"type": "nauc_map_at_100_std", "value": -14.285806618869273}, {"type": "nauc_map_at_10_diff1", "value": 77.1806705504232}, {"type": "nauc_map_at_10_max", "value": 59.02905805134415}, {"type": "nauc_map_at_10_std", "value": 
-14.132954900037467}, {"type": "nauc_map_at_1_diff1", "value": 81.03932970557837}, {"type": "nauc_map_at_1_max", "value": 49.02073230264529}, {"type": "nauc_map_at_1_std", "value": -22.977452975845512}, {"type": "nauc_map_at_20_diff1", "value": 77.22581364818562}, {"type": "nauc_map_at_20_max", "value": 58.90740400399768}, {"type": "nauc_map_at_20_std", "value": -14.245079150986745}, {"type": "nauc_map_at_3_diff1", "value": 76.99793243255563}, {"type": "nauc_map_at_3_max", "value": 54.9930733886623}, {"type": "nauc_map_at_3_std", "value": -19.297708446082407}, {"type": "nauc_map_at_5_diff1", "value": 77.1671608360295}, {"type": "nauc_map_at_5_max", "value": 57.27757489519526}, {"type": "nauc_map_at_5_std", "value": -15.446338357667708}, {"type": "nauc_mrr_at_1000_diff1", "value": 77.4806080821202}, {"type": "nauc_mrr_at_1000_max", "value": 60.9213776129792}, {"type": "nauc_mrr_at_1000_std", "value": -12.139599632228343}, {"type": "nauc_mrr_at_100_diff1", "value": 77.48158073865281}, {"type": "nauc_mrr_at_100_max", "value": 60.9218657185361}, {"type": "nauc_mrr_at_100_std", "value": -12.13532070453677}, {"type": "nauc_mrr_at_10_diff1", "value": 77.32428546014407}, {"type": "nauc_mrr_at_10_max", "value": 61.018407010343466}, {"type": "nauc_mrr_at_10_std", "value": -12.143193773309347}, {"type": "nauc_mrr_at_1_diff1", "value": 80.99806778887115}, {"type": "nauc_mrr_at_1_max", "value": 59.17855969530095}, {"type": "nauc_mrr_at_1_std", "value": -12.30545640831458}, {"type": "nauc_mrr_at_20_diff1", "value": 77.3811067653992}, {"type": "nauc_mrr_at_20_max", "value": 60.9648880366335}, {"type": "nauc_mrr_at_20_std", "value": -12.124066076541853}, {"type": "nauc_mrr_at_3_diff1", "value": 77.31304316321959}, {"type": "nauc_mrr_at_3_max", "value": 60.75536766404163}, {"type": "nauc_mrr_at_3_std", "value": -12.997876030849623}, {"type": "nauc_mrr_at_5_diff1", "value": 77.12952864141742}, {"type": "nauc_mrr_at_5_max", "value": 60.995943754968685}, {"type": "nauc_mrr_at_5_std", "value": -11.353447465605694}, {"type": "nauc_ndcg_at_1000_diff1", "value": 76.81788665683746}, {"type": "nauc_ndcg_at_1000_max", "value": 60.35947755262391}, {"type": "nauc_ndcg_at_1000_std", "value": -12.884942372460362}, {"type": "nauc_ndcg_at_100_diff1", "value": 76.87388230365198}, {"type": "nauc_ndcg_at_100_max", "value": 60.38813162962434}, {"type": "nauc_ndcg_at_100_std", "value": -12.64384717800478}, {"type": "nauc_ndcg_at_10_diff1", "value": 75.87713506026317}, {"type": "nauc_ndcg_at_10_max", "value": 61.39356554675667}, {"type": "nauc_ndcg_at_10_std", "value": -12.144227584144218}, {"type": "nauc_ndcg_at_1_diff1", "value": 80.99806778887115}, {"type": "nauc_ndcg_at_1_max", "value": 59.17855969530095}, {"type": "nauc_ndcg_at_1_std", "value": -12.30545640831458}, {"type": "nauc_ndcg_at_20_diff1", "value": 76.09913944506627}, {"type": "nauc_ndcg_at_20_max", "value": 61.01644448834147}, {"type": "nauc_ndcg_at_20_std", "value": -12.456209267623857}, {"type": "nauc_ndcg_at_3_diff1", "value": 75.52717946614608}, {"type": "nauc_ndcg_at_3_max", "value": 58.96433090721983}, {"type": "nauc_ndcg_at_3_std", "value": -15.849280494339556}, {"type": "nauc_ndcg_at_5_diff1", "value": 75.69026981016921}, {"type": "nauc_ndcg_at_5_max", "value": 58.924044405851326}, {"type": "nauc_ndcg_at_5_std", "value": -13.182728827923107}, {"type": "nauc_precision_at_1000_diff1", "value": -31.634022001609914}, {"type": "nauc_precision_at_1000_max", "value": 31.46271490784504}, {"type": "nauc_precision_at_1000_std", "value": 60.44801276891442}, {"type": 
"nauc_precision_at_100_diff1", "value": -29.722363469948103}, {"type": "nauc_precision_at_100_max", "value": 32.05464592020074}, {"type": "nauc_precision_at_100_std", "value": 60.832570595613554}, {"type": "nauc_precision_at_10_diff1", "value": -11.91731376599939}, {"type": "nauc_precision_at_10_max", "value": 45.43646553157129}, {"type": "nauc_precision_at_10_std", "value": 52.962408871791276}, {"type": "nauc_precision_at_1_diff1", "value": 80.99806778887115}, {"type": "nauc_precision_at_1_max", "value": 59.17855969530095}, {"type": "nauc_precision_at_1_std", "value": -12.30545640831458}, {"type": "nauc_precision_at_20_diff1", "value": -18.43293701721667}, {"type": "nauc_precision_at_20_max", "value": 39.53434874203934}, {"type": "nauc_precision_at_20_std", "value": 53.6291982468461}, {"type": "nauc_precision_at_3_diff1", "value": 30.84789043003892}, {"type": "nauc_precision_at_3_max", "value": 55.660727758110376}, {"type": "nauc_precision_at_3_std", "value": 17.87243920840355}, {"type": "nauc_precision_at_5_diff1", "value": 4.099395181445625}, {"type": "nauc_precision_at_5_max", "value": 50.346770968709386}, {"type": "nauc_precision_at_5_std", "value": 44.66722483255029}, {"type": "nauc_recall_at_1000_diff1", "value": NaN}, {"type": "nauc_recall_at_1000_max", "value": NaN}, {"type": "nauc_recall_at_1000_std", "value": NaN}, {"type": "nauc_recall_at_100_diff1", "value": 100.0}, {"type": "nauc_recall_at_100_max", "value": 72.2222222222207}, {"type": "nauc_recall_at_100_std", "value": 86.92810457516407}, {"type": "nauc_recall_at_10_diff1", "value": 62.18887555022005}, {"type": "nauc_recall_at_10_max", "value": 75.14339068960916}, {"type": "nauc_recall_at_10_std", "value": -1.4912631719357108}, {"type": "nauc_recall_at_1_diff1", "value": 81.03932970557837}, {"type": "nauc_recall_at_1_max", "value": 49.02073230264529}, {"type": "nauc_recall_at_1_std", "value": -22.977452975845512}, {"type": "nauc_recall_at_20_diff1", "value": 59.27414444038499}, {"type": "nauc_recall_at_20_max", "value": 76.32241302318047}, {"type": "nauc_recall_at_20_std", "value": -0.8322169447488666}, {"type": "nauc_recall_at_3_diff1", "value": 69.58783002593157}, {"type": "nauc_recall_at_3_max", "value": 55.89660919896563}, {"type": "nauc_recall_at_3_std", "value": -21.183005510917862}, {"type": "nauc_recall_at_5_diff1", "value": 65.53660499878802}, {"type": "nauc_recall_at_5_max", "value": 58.218018535135805}, {"type": "nauc_recall_at_5_std", "value": -8.328952210032455}, {"type": "ndcg_at_1", "value": 66.333}, {"type": "ndcg_at_10", "value": 80.08999999999999}, {"type": "ndcg_at_100", "value": 81.24900000000001}, {"type": "ndcg_at_1000", "value": 81.28800000000001}, {"type": "ndcg_at_20", "value": 80.625}, {"type": "ndcg_at_3", "value": 74.98700000000001}, {"type": "ndcg_at_5", "value": 78.553}, {"type": "precision_at_1", "value": 66.333}, {"type": "precision_at_10", "value": 10.667}, {"type": "precision_at_100", "value": 1.127}, {"type": "precision_at_1000", "value": 0.11299999999999999}, {"type": "precision_at_20", "value": 5.45}, {"type": "precision_at_3", "value": 29.555999999999997}, {"type": "precision_at_5", "value": 20.133000000000003}, {"type": "recall_at_1", "value": 63.161}, {"type": "recall_at_10", "value": 94.167}, {"type": "recall_at_100", "value": 99.667}, {"type": "recall_at_1000", "value": 100.0}, {"type": "recall_at_20", "value": 96.167}, {"type": "recall_at_3", "value": 80.972}, {"type": "recall_at_5", "value": 89.90599999999999}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB 
SprintDuplicateQuestions", "type": "mteb/sprintduplicatequestions-pairclassification", "config": "default", "split": "test", "revision": "d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46"}, "metrics": [{"type": "cosine_accuracy", "value": 99.81881188118813}, {"type": "cosine_accuracy_threshold", "value": 85.55081486701965}, {"type": "cosine_ap", "value": 96.0359661816236}, {"type": "cosine_f1", "value": 90.6584992343032}, {"type": "cosine_f1_threshold", "value": 84.82859134674072}, {"type": "cosine_precision", "value": 92.59645464025026}, {"type": "cosine_recall", "value": 88.8}, {"type": "dot_accuracy", "value": 99.81881188118813}, {"type": "dot_accuracy_threshold", "value": 84.91908311843872}, {"type": "dot_ap", "value": 96.05740121094365}, {"type": "dot_f1", "value": 90.81885856079404}, {"type": "dot_f1_threshold", "value": 83.84919166564941}, {"type": "dot_precision", "value": 90.14778325123153}, {"type": "dot_recall", "value": 91.5}, {"type": "euclidean_accuracy", "value": 99.82079207920792}, {"type": "euclidean_accuracy_threshold", "value": 54.49706315994263}, {"type": "euclidean_ap", "value": 96.03223527068818}, {"type": "euclidean_f1", "value": 90.72270630445925}, {"type": "euclidean_f1_threshold", "value": 54.49706315994263}, {"type": "euclidean_precision", "value": 93.05993690851734}, {"type": "euclidean_recall", "value": 88.5}, {"type": "main_score", "value": 96.32671902439806}, {"type": "manhattan_accuracy", "value": 99.83267326732673}, {"type": "manhattan_accuracy_threshold", "value": 3818.192672729492}, {"type": "manhattan_ap", "value": 96.32671902439806}, {"type": "manhattan_f1", "value": 91.52032112393378}, {"type": "manhattan_f1_threshold", "value": 3818.192672729492}, {"type": "manhattan_precision", "value": 91.8429003021148}, {"type": "manhattan_recall", "value": 91.2}, {"type": "max_ap", "value": 96.32671902439806}, {"type": "max_f1", "value": 91.52032112393378}, {"type": "max_precision", "value": 93.05993690851734}, {"type": "max_recall", "value": 91.5}, {"type": "similarity_accuracy", "value": 99.81881188118813}, {"type": "similarity_accuracy_threshold", "value": 85.55081486701965}, {"type": "similarity_ap", "value": 96.0359661816236}, {"type": "similarity_f1", "value": 90.6584992343032}, {"type": "similarity_f1_threshold", "value": 84.82859134674072}, {"type": "similarity_precision", "value": 92.59645464025026}, {"type": "similarity_recall", "value": 88.8}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB StackExchangeClustering", "type": "mteb/stackexchange-clustering", "config": "default", "split": "test", "revision": "6cbc1f7b2bc0622f2e39d2c77fa502909748c259"}, "metrics": [{"type": "main_score", "value": 80.28558559137414}, {"type": "v_measure", "value": 80.28558559137414}, {"type": "v_measure_std", "value": 2.795276520287584}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB StackExchangeClusteringP2P", "type": "mteb/stackexchange-clustering-p2p", "config": "default", "split": "test", "revision": "815ca46b2622cec33ccafc3735d572c266efdb44"}, "metrics": [{"type": "main_score", "value": 49.57135582416209}, {"type": "v_measure", "value": 49.57135582416209}, {"type": "v_measure_std", "value": 1.6414135468423754}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB StackOverflowDupQuestions", "type": "mteb/stackoverflowdupquestions-reranking", "config": "default", "split": "test", "revision": "e185fbe320c72810689fc5848eb6114e1ef5ec69"}, "metrics": [{"type": "main_score", "value": 55.253002583598644}, {"type": "map", "value": 55.253002583598644}, 
{"type": "mrr", "value": 56.24172396231219}, {"type": "nAUC_map_diff1", "value": 40.00053248203427}, {"type": "nAUC_map_max", "value": 10.05441740585869}, {"type": "nAUC_map_std", "value": 8.227169286387552}, {"type": "nAUC_mrr_diff1", "value": 40.250446264233744}, {"type": "nAUC_mrr_max", "value": 10.586310195339053}, {"type": "nAUC_mrr_std", "value": 8.47326494370076}]}, {"task": {"type": "Summarization"}, "dataset": {"name": "MTEB SummEval", "type": "mteb/summeval", "config": "default", "split": "test", "revision": "cda12ad7615edc362dbf25a00fdd61d3b1eaf93c"}, "metrics": [{"type": "cosine_pearson", "value": 31.19874648747059}, {"type": "cosine_spearman", "value": 31.493550648844863}, {"type": "dot_pearson", "value": 31.157847680289407}, {"type": "dot_spearman", "value": 31.575299712180538}, {"type": "main_score", "value": 31.493550648844863}, {"type": "pearson", "value": 31.19874648747059}, {"type": "spearman", "value": 31.493550648844863}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB TRECCOVID", "type": "mteb/trec-covid", "config": "default", "split": "test", "revision": "bb9466bac8153a0349341eb1b22e06409e78ef4e"}, "metrics": [{"type": "main_score", "value": 85.983}, {"type": "map_at_1", "value": 0.247}, {"type": "map_at_10", "value": 2.177}, {"type": "map_at_100", "value": 14.804}, {"type": "map_at_1000", "value": 37.045}, {"type": "map_at_20", "value": 4.12}, {"type": "map_at_3", "value": 0.7000000000000001}, {"type": "map_at_5", "value": 1.1320000000000001}, {"type": "mrr_at_1", "value": 96.0}, {"type": "mrr_at_10", "value": 98.0}, {"type": "mrr_at_100", "value": 98.0}, {"type": "mrr_at_1000", "value": 98.0}, {"type": "mrr_at_20", "value": 98.0}, {"type": "mrr_at_3", "value": 98.0}, {"type": "mrr_at_5", "value": 98.0}, {"type": "nauc_map_at_1000_diff1", "value": -0.9165125200337213}, {"type": "nauc_map_at_1000_max", "value": 40.260117798042764}, {"type": "nauc_map_at_1000_std", "value": 71.72789335831554}, {"type": "nauc_map_at_100_diff1", "value": 20.493827311583953}, {"type": "nauc_map_at_100_max", "value": 21.005742079276462}, {"type": "nauc_map_at_100_std", "value": 62.53815607831659}, {"type": "nauc_map_at_10_diff1", "value": 31.289297684528215}, {"type": "nauc_map_at_10_max", "value": 7.86554294370268}, {"type": "nauc_map_at_10_std", "value": 37.26191657133897}, {"type": "nauc_map_at_1_diff1", "value": 25.57568148849456}, {"type": "nauc_map_at_1_max", "value": -5.9767435623941445}, {"type": "nauc_map_at_1_std", "value": 30.849871717506755}, {"type": "nauc_map_at_20_diff1", "value": 30.896018204532087}, {"type": "nauc_map_at_20_max", "value": 8.667077299744314}, {"type": "nauc_map_at_20_std", "value": 41.512687168412924}, {"type": "nauc_map_at_3_diff1", "value": 29.44724521006598}, {"type": "nauc_map_at_3_max", "value": 1.597496889532064}, {"type": "nauc_map_at_3_std", "value": 32.25013773854697}, {"type": "nauc_map_at_5_diff1", "value": 27.387036605618825}, {"type": "nauc_map_at_5_max", "value": 5.402983746211454}, {"type": "nauc_map_at_5_std", "value": 33.940523962472184}, {"type": "nauc_mrr_at_1000_diff1", "value": -14.122315592903503}, {"type": "nauc_mrr_at_1000_max", "value": 33.84687208216605}, {"type": "nauc_mrr_at_1000_std", "value": 86.11111111111092}, {"type": "nauc_mrr_at_100_diff1", "value": -14.122315592903503}, {"type": "nauc_mrr_at_100_max", "value": 33.84687208216605}, {"type": "nauc_mrr_at_100_std", "value": 86.11111111111092}, {"type": "nauc_mrr_at_10_diff1", "value": -14.122315592903503}, {"type": "nauc_mrr_at_10_max", "value": 
33.84687208216605}, {"type": "nauc_mrr_at_10_std", "value": 86.11111111111092}, {"type": "nauc_mrr_at_1_diff1", "value": -14.122315592903831}, {"type": "nauc_mrr_at_1_max", "value": 33.84687208216637}, {"type": "nauc_mrr_at_1_std", "value": 86.11111111111124}, {"type": "nauc_mrr_at_20_diff1", "value": -14.122315592903503}, {"type": "nauc_mrr_at_20_max", "value": 33.84687208216605}, {"type": "nauc_mrr_at_20_std", "value": 86.11111111111092}, {"type": "nauc_mrr_at_3_diff1", "value": -14.122315592903503}, {"type": "nauc_mrr_at_3_max", "value": 33.84687208216605}, {"type": "nauc_mrr_at_3_std", "value": 86.11111111111092}, {"type": "nauc_mrr_at_5_diff1", "value": -14.122315592903503}, {"type": "nauc_mrr_at_5_max", "value": 33.84687208216605}, {"type": "nauc_mrr_at_5_std", "value": 86.11111111111092}, {"type": "nauc_ndcg_at_1000_diff1", "value": 8.745907669561928}, {"type": "nauc_ndcg_at_1000_max", "value": 45.43307237994533}, {"type": "nauc_ndcg_at_1000_std", "value": 74.93357447176336}, {"type": "nauc_ndcg_at_100_diff1", "value": -3.9719350773353765}, {"type": "nauc_ndcg_at_100_max", "value": 44.43705332397461}, {"type": "nauc_ndcg_at_100_std", "value": 61.59493812371758}, {"type": "nauc_ndcg_at_10_diff1", "value": 15.230915878367348}, {"type": "nauc_ndcg_at_10_max", "value": 48.332840970836635}, {"type": "nauc_ndcg_at_10_std", "value": 46.888785065125774}, {"type": "nauc_ndcg_at_1_diff1", "value": 13.219732337379442}, {"type": "nauc_ndcg_at_1_max", "value": 45.19919078742603}, {"type": "nauc_ndcg_at_1_std", "value": 64.68253968253977}, {"type": "nauc_ndcg_at_20_diff1", "value": 12.479648691964865}, {"type": "nauc_ndcg_at_20_max", "value": 48.76688248450331}, {"type": "nauc_ndcg_at_20_std", "value": 51.450399755887545}, {"type": "nauc_ndcg_at_3_diff1", "value": 6.165414201871464}, {"type": "nauc_ndcg_at_3_max", "value": 45.089689347691035}, {"type": "nauc_ndcg_at_3_std", "value": 41.08249161845213}, {"type": "nauc_ndcg_at_5_diff1", "value": 7.411245806844721}, {"type": "nauc_ndcg_at_5_max", "value": 47.818748093538076}, {"type": "nauc_ndcg_at_5_std", "value": 45.907685763676575}, {"type": "nauc_precision_at_1000_diff1", "value": -30.574290219847345}, {"type": "nauc_precision_at_1000_max", "value": 32.56926126118719}, {"type": "nauc_precision_at_1000_std", "value": 14.584504392628874}, {"type": "nauc_precision_at_100_diff1", "value": -10.199740234718847}, {"type": "nauc_precision_at_100_max", "value": 41.0213226769777}, {"type": "nauc_precision_at_100_std", "value": 56.975760776771324}, {"type": "nauc_precision_at_10_diff1", "value": 7.865792689701161}, {"type": "nauc_precision_at_10_max", "value": 52.00432275201737}, {"type": "nauc_precision_at_10_std", "value": 43.89512276413724}, {"type": "nauc_precision_at_1_diff1", "value": -14.122315592903831}, {"type": "nauc_precision_at_1_max", "value": 33.84687208216637}, {"type": "nauc_precision_at_1_std", "value": 86.11111111111124}, {"type": "nauc_precision_at_20_diff1", "value": 5.481424191880084}, {"type": "nauc_precision_at_20_max", "value": 46.86629331792725}, {"type": "nauc_precision_at_20_std", "value": 49.245692667517496}, {"type": "nauc_precision_at_3_diff1", "value": -5.870408807869163}, {"type": "nauc_precision_at_3_max", "value": 48.73657612128875}, {"type": "nauc_precision_at_3_std", "value": 41.15152062088262}, {"type": "nauc_precision_at_5_diff1", "value": -4.550610529125413}, {"type": "nauc_precision_at_5_max", "value": 60.390115878205386}, {"type": "nauc_precision_at_5_std", "value": 44.16494295055696}, {"type": 
"nauc_recall_at_1000_diff1", "value": 8.047794367079034}, {"type": "nauc_recall_at_1000_max", "value": 37.07551482870489}, {"type": "nauc_recall_at_1000_std", "value": 66.20862163364201}, {"type": "nauc_recall_at_100_diff1", "value": 25.08104923597475}, {"type": "nauc_recall_at_100_max", "value": 9.971294642165734}, {"type": "nauc_recall_at_100_std", "value": 51.737814074891254}, {"type": "nauc_recall_at_10_diff1", "value": 32.33148478369628}, {"type": "nauc_recall_at_10_max", "value": 1.3767192150014917}, {"type": "nauc_recall_at_10_std", "value": 30.801926742876308}, {"type": "nauc_recall_at_1_diff1", "value": 25.57568148849456}, {"type": "nauc_recall_at_1_max", "value": -5.9767435623941445}, {"type": "nauc_recall_at_1_std", "value": 30.849871717506755}, {"type": "nauc_recall_at_20_diff1", "value": 31.716580022934654}, {"type": "nauc_recall_at_20_max", "value": -0.1281270579464631}, {"type": "nauc_recall_at_20_std", "value": 33.76185294993676}, {"type": "nauc_recall_at_3_diff1", "value": 29.758810004388348}, {"type": "nauc_recall_at_3_max", "value": -1.9442985017191816}, {"type": "nauc_recall_at_3_std", "value": 27.45550076962206}, {"type": "nauc_recall_at_5_diff1", "value": 27.047710181576672}, {"type": "nauc_recall_at_5_max", "value": 1.5237000700880248}, {"type": "nauc_recall_at_5_std", "value": 28.235297950159698}, {"type": "ndcg_at_1", "value": 94.0}, {"type": "ndcg_at_10", "value": 85.983}, {"type": "ndcg_at_100", "value": 69.195}, {"type": "ndcg_at_1000", "value": 62.541000000000004}, {"type": "ndcg_at_20", "value": 83.405}, {"type": "ndcg_at_3", "value": 89.98899999999999}, {"type": "ndcg_at_5", "value": 87.905}, {"type": "precision_at_1", "value": 96.0}, {"type": "precision_at_10", "value": 89.4}, {"type": "precision_at_100", "value": 71.54}, {"type": "precision_at_1000", "value": 27.594}, {"type": "precision_at_20", "value": 87.2}, {"type": "precision_at_3", "value": 92.667}, {"type": "precision_at_5", "value": 90.8}, {"type": "recall_at_1", "value": 0.247}, {"type": "recall_at_10", "value": 2.315}, {"type": "recall_at_100", "value": 17.574}, {"type": "recall_at_1000", "value": 59.336999999999996}, {"type": "recall_at_20", "value": 4.491}, {"type": "recall_at_3", "value": 0.7250000000000001}, {"type": "recall_at_5", "value": 1.1820000000000002}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB Touche2020", "type": "mteb/touche2020", "config": "default", "split": "test", "revision": "a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f"}, "metrics": [{"type": "main_score", "value": 29.944}, {"type": "map_at_1", "value": 3.064}, {"type": "map_at_10", "value": 11.501999999999999}, {"type": "map_at_100", "value": 18.736}, {"type": "map_at_1000", "value": 20.333000000000002}, {"type": "map_at_20", "value": 14.057}, {"type": "map_at_3", "value": 6.300999999999999}, {"type": "map_at_5", "value": 8.463}, {"type": "mrr_at_1", "value": 44.89795918367347}, {"type": "mrr_at_10", "value": 58.41188856494979}, {"type": "mrr_at_100", "value": 58.93964266413245}, {"type": "mrr_at_1000", "value": 58.93964266413245}, {"type": "mrr_at_20", "value": 58.767485349118}, {"type": "mrr_at_3", "value": 54.42176870748299}, {"type": "mrr_at_5", "value": 56.666666666666664}, {"type": "nauc_map_at_1000_diff1", "value": 11.478593385608479}, {"type": "nauc_map_at_1000_max", "value": 10.309889845044324}, {"type": "nauc_map_at_1000_std", "value": 21.16721939940238}, {"type": "nauc_map_at_100_diff1", "value": 11.570438543562418}, {"type": "nauc_map_at_100_max", "value": 8.426183648064834}, {"type": 
"nauc_map_at_100_std", "value": 18.56231985033613}, {"type": "nauc_map_at_10_diff1", "value": 22.37735506247481}, {"type": "nauc_map_at_10_max", "value": 5.455946239060806}, {"type": "nauc_map_at_10_std", "value": -4.2848826518388154}, {"type": "nauc_map_at_1_diff1", "value": 27.853645380676824}, {"type": "nauc_map_at_1_max", "value": 7.30739948053113}, {"type": "nauc_map_at_1_std", "value": -0.2773663157814586}, {"type": "nauc_map_at_20_diff1", "value": 14.724669779924648}, {"type": "nauc_map_at_20_max", "value": 10.12882779173533}, {"type": "nauc_map_at_20_std", "value": 4.4803777672120875}, {"type": "nauc_map_at_3_diff1", "value": 31.891173385921263}, {"type": "nauc_map_at_3_max", "value": 4.889652271827218}, {"type": "nauc_map_at_3_std", "value": -9.477460238651643}, {"type": "nauc_map_at_5_diff1", "value": 31.489012040465003}, {"type": "nauc_map_at_5_max", "value": 1.7330092417337482}, {"type": "nauc_map_at_5_std", "value": -8.137018608469637}, {"type": "nauc_mrr_at_1000_diff1", "value": 24.411522237082416}, {"type": "nauc_mrr_at_1000_max", "value": 11.286971076556688}, {"type": "nauc_mrr_at_1000_std", "value": 23.443174210894043}, {"type": "nauc_mrr_at_100_diff1", "value": 24.411522237082416}, {"type": "nauc_mrr_at_100_max", "value": 11.286971076556688}, {"type": "nauc_mrr_at_100_std", "value": 23.443174210894043}, {"type": "nauc_mrr_at_10_diff1", "value": 23.948152308265186}, {"type": "nauc_mrr_at_10_max", "value": 12.22420979621155}, {"type": "nauc_mrr_at_10_std", "value": 23.557939024705544}, {"type": "nauc_mrr_at_1_diff1", "value": 17.902334894536107}, {"type": "nauc_mrr_at_1_max", "value": 17.36969662861018}, {"type": "nauc_mrr_at_1_std", "value": 19.425714969048734}, {"type": "nauc_mrr_at_20_diff1", "value": 24.635893795899797}, {"type": "nauc_mrr_at_20_max", "value": 11.330541067194913}, {"type": "nauc_mrr_at_20_std", "value": 23.74518583400233}, {"type": "nauc_mrr_at_3_diff1", "value": 25.045536328282587}, {"type": "nauc_mrr_at_3_max", "value": 7.497967004732733}, {"type": "nauc_mrr_at_3_std", "value": 24.167153007320078}, {"type": "nauc_mrr_at_5_diff1", "value": 24.328479930592454}, {"type": "nauc_mrr_at_5_max", "value": 10.037126854938336}, {"type": "nauc_mrr_at_5_std", "value": 25.236208055346136}, {"type": "nauc_ndcg_at_1000_diff1", "value": 15.555347444667389}, {"type": "nauc_ndcg_at_1000_max", "value": 13.356591700655718}, {"type": "nauc_ndcg_at_1000_std", "value": 42.42395845935052}, {"type": "nauc_ndcg_at_100_diff1", "value": 13.110526060413708}, {"type": "nauc_ndcg_at_100_max", "value": 3.140006440162515}, {"type": "nauc_ndcg_at_100_std", "value": 39.02733288398033}, {"type": "nauc_ndcg_at_10_diff1", "value": 20.68853369009725}, {"type": "nauc_ndcg_at_10_max", "value": 2.435389817058852}, {"type": "nauc_ndcg_at_10_std", "value": 10.038202768784316}, {"type": "nauc_ndcg_at_1_diff1", "value": 20.17287594582385}, {"type": "nauc_ndcg_at_1_max", "value": 12.487205168273196}, {"type": "nauc_ndcg_at_1_std", "value": 20.639827614373075}, {"type": "nauc_ndcg_at_20_diff1", "value": 16.987577348502985}, {"type": "nauc_ndcg_at_20_max", "value": 2.9978717644469266}, {"type": "nauc_ndcg_at_20_std", "value": 13.015690866750354}, {"type": "nauc_ndcg_at_3_diff1", "value": 32.392223079245575}, {"type": "nauc_ndcg_at_3_max", "value": 1.587587110582544}, {"type": "nauc_ndcg_at_3_std", "value": 12.850592473446609}, {"type": "nauc_ndcg_at_5_diff1", "value": 32.80244517369626}, {"type": "nauc_ndcg_at_5_max", "value": 5.8939933777508084}, {"type": "nauc_ndcg_at_5_std", "value": 
15.779687411463414}, {"type": "nauc_precision_at_1000_diff1", "value": -14.314031720452537}, {"type": "nauc_precision_at_1000_max", "value": 32.87886666567266}, {"type": "nauc_precision_at_1000_std", "value": 21.49347046886851}, {"type": "nauc_precision_at_100_diff1", "value": -9.4034008613839}, {"type": "nauc_precision_at_100_max", "value": 16.784075123309645}, {"type": "nauc_precision_at_100_std", "value": 73.14688535393604}, {"type": "nauc_precision_at_10_diff1", "value": 6.855101404043058}, {"type": "nauc_precision_at_10_max", "value": 6.52491228645612}, {"type": "nauc_precision_at_10_std", "value": 16.104602266016744}, {"type": "nauc_precision_at_1_diff1", "value": 17.902334894536107}, {"type": "nauc_precision_at_1_max", "value": 17.36969662861018}, {"type": "nauc_precision_at_1_std", "value": 19.425714969048734}, {"type": "nauc_precision_at_20_diff1", "value": -5.337534613602212}, {"type": "nauc_precision_at_20_max", "value": 17.722925454767218}, {"type": "nauc_precision_at_20_std", "value": 34.26680462132849}, {"type": "nauc_precision_at_3_diff1", "value": 31.054623397809255}, {"type": "nauc_precision_at_3_max", "value": -0.92038600946826}, {"type": "nauc_precision_at_3_std", "value": 8.326997076862916}, {"type": "nauc_precision_at_5_diff1", "value": 29.784942296920462}, {"type": "nauc_precision_at_5_max", "value": 6.337469263434779}, {"type": "nauc_precision_at_5_std", "value": 12.789597196020974}, {"type": "nauc_recall_at_1000_diff1", "value": -3.8177981862041364}, {"type": "nauc_recall_at_1000_max", "value": 14.206064332229163}, {"type": "nauc_recall_at_1000_std", "value": 74.18853420771269}, {"type": "nauc_recall_at_100_diff1", "value": 0.7677996771461106}, {"type": "nauc_recall_at_100_max", "value": -4.139924106878441}, {"type": "nauc_recall_at_100_std", "value": 48.319930706362896}, {"type": "nauc_recall_at_10_diff1", "value": 12.038835537494322}, {"type": "nauc_recall_at_10_max", "value": -2.0498983557854418}, {"type": "nauc_recall_at_10_std", "value": -2.0339180690854493}, {"type": "nauc_recall_at_1_diff1", "value": 27.853645380676824}, {"type": "nauc_recall_at_1_max", "value": 7.30739948053113}, {"type": "nauc_recall_at_1_std", "value": -0.2773663157814586}, {"type": "nauc_recall_at_20_diff1", "value": 0.7907893667756708}, {"type": "nauc_recall_at_20_max", "value": 0.8795499810558195}, {"type": "nauc_recall_at_20_std", "value": 11.512483291688282}, {"type": "nauc_recall_at_3_diff1", "value": 33.19440392639576}, {"type": "nauc_recall_at_3_max", "value": -1.5494237697432613}, {"type": "nauc_recall_at_3_std", "value": -8.560408808376984}, {"type": "nauc_recall_at_5_diff1", "value": 27.42193873870941}, {"type": "nauc_recall_at_5_max", "value": -4.74350293281128}, {"type": "nauc_recall_at_5_std", "value": -7.618060131179654}, {"type": "ndcg_at_1", "value": 42.857}, {"type": "ndcg_at_10", "value": 29.944}, {"type": "ndcg_at_100", "value": 42.624}, {"type": "ndcg_at_1000", "value": 53.384}, {"type": "ndcg_at_20", "value": 30.135}, {"type": "ndcg_at_3", "value": 34.847}, {"type": "ndcg_at_5", "value": 32.573}, {"type": "precision_at_1", "value": 44.897999999999996}, {"type": "precision_at_10", "value": 25.306}, {"type": "precision_at_100", "value": 8.694}, {"type": "precision_at_1000", "value": 1.616}, {"type": "precision_at_20", "value": 19.082}, {"type": "precision_at_3", "value": 34.014}, {"type": "precision_at_5", "value": 31.019999999999996}, {"type": "recall_at_1", "value": 3.064}, {"type": "recall_at_10", "value": 17.849999999999998}, {"type": "recall_at_100", "value": 
53.217999999999996}, {"type": "recall_at_1000", "value": 87.095}, {"type": "recall_at_20", "value": 26.111}, {"type": "recall_at_3", "value": 7.383000000000001}, {"type": "recall_at_5", "value": 11.434}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB ToxicConversationsClassification", "type": "mteb/toxic_conversations_50k", "config": "default", "split": "test", "revision": "edfaf9da55d3dd50d43143d90c1ac476895ae6de"}, "metrics": [{"type": "accuracy", "value": 88.759765625}, {"type": "ap", "value": 36.49152357863017}, {"type": "ap_weighted", "value": 36.49152357863017}, {"type": "f1", "value": 74.4692714448641}, {"type": "f1_weighted", "value": 90.54372649306606}, {"type": "main_score", "value": 88.759765625}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB TweetSentimentExtractionClassification", "type": "mteb/tweet_sentiment_extraction", "config": "default", "split": "test", "revision": "d604517c81ca91fe16a244d1248fc021f9ecee7a"}, "metrics": [{"type": "accuracy", "value": 74.8443689869836}, {"type": "f1", "value": 75.1139662898148}, {"type": "f1_weighted", "value": 74.7369003946243}, {"type": "main_score", "value": 74.8443689869836}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB TwentyNewsgroupsClustering", "type": "mteb/twentynewsgroups-clustering", "config": "default", "split": "test", "revision": "6125ec4e24fa026cec8a478383ee943acfbd5449"}, "metrics": [{"type": "main_score", "value": 61.42918790942448}, {"type": "v_measure", "value": 61.42918790942448}, {"type": "v_measure_std", "value": 1.0156550098843082}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB TwitterSemEval2015", "type": "mteb/twittersemeval2015-pairclassification", "config": "default", "split": "test", "revision": "70970daeab8776df92f5ea462b6173c0b46fd2d1"}, "metrics": [{"type": "cosine_accuracy", "value": 88.22197055492639}, {"type": "cosine_accuracy_threshold", "value": 83.30042362213135}, {"type": "cosine_ap", "value": 80.57754959194938}, {"type": "cosine_f1", "value": 73.70579190158894}, {"type": "cosine_f1_threshold", "value": 81.04978799819946}, {"type": "cosine_precision", "value": 71.64922770303936}, {"type": "cosine_recall", "value": 75.8839050131926}, {"type": "dot_accuracy", "value": 88.23985217857782}, {"type": "dot_accuracy_threshold", "value": 83.31039547920227}, {"type": "dot_ap", "value": 80.57533213448181}, {"type": "dot_f1", "value": 73.61309601143302}, {"type": "dot_f1_threshold", "value": 81.33968114852905}, {"type": "dot_precision", "value": 72.51087791144101}, {"type": "dot_recall", "value": 74.74934036939314}, {"type": "euclidean_accuracy", "value": 88.22197055492639}, {"type": "euclidean_accuracy_threshold", "value": 58.290231227874756}, {"type": "euclidean_ap", "value": 80.57982723880139}, {"type": "euclidean_f1", "value": 73.63426519620417}, {"type": "euclidean_f1_threshold", "value": 61.55576705932617}, {"type": "euclidean_precision", "value": 71.63173652694611}, {"type": "euclidean_recall", "value": 75.75197889182058}, {"type": "main_score", "value": 80.57982723880139}, {"type": "manhattan_accuracy", "value": 88.14448351910353}, {"type": "manhattan_accuracy_threshold", "value": 3907.2471618652344}, {"type": "manhattan_ap", "value": 80.3538079655539}, {"type": "manhattan_f1", "value": 73.40466675261054}, {"type": "manhattan_f1_threshold", "value": 4103.794097900391}, {"type": "manhattan_precision", "value": 71.76707839677337}, {"type": "manhattan_recall", "value": 75.11873350923483}, {"type": "max_ap", "value": 
80.57982723880139}, {"type": "max_f1", "value": 73.70579190158894}, {"type": "max_precision", "value": 72.51087791144101}, {"type": "max_recall", "value": 75.8839050131926}, {"type": "similarity_accuracy", "value": 88.22197055492639}, {"type": "similarity_accuracy_threshold", "value": 83.30042362213135}, {"type": "similarity_ap", "value": 80.57754959194938}, {"type": "similarity_f1", "value": 73.70579190158894}, {"type": "similarity_f1_threshold", "value": 81.04978799819946}, {"type": "similarity_precision", "value": 71.64922770303936}, {"type": "similarity_recall", "value": 75.8839050131926}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB TwitterURLCorpus", "type": "mteb/twitterurlcorpus-pairclassification", "config": "default", "split": "test", "revision": "8b6510b0b1fa4e4c4f879467980e9be563ec1cdf"}, "metrics": [{"type": "cosine_accuracy", "value": 89.88628866379477}, {"type": "cosine_accuracy_threshold", "value": 80.8050274848938}, {"type": "cosine_ap", "value": 87.57594591596816}, {"type": "cosine_f1", "value": 80.0812257707218}, {"type": "cosine_f1_threshold", "value": 77.990061044693}, {"type": "cosine_precision", "value": 76.93126197063205}, {"type": "cosine_recall", "value": 83.50015398829689}, {"type": "dot_accuracy", "value": 89.87852679784221}, {"type": "dot_accuracy_threshold", "value": 80.84419965744019}, {"type": "dot_ap", "value": 87.56136742222151}, {"type": "dot_f1", "value": 80.05898617511521}, {"type": "dot_f1_threshold", "value": 77.92385816574097}, {"type": "dot_precision", "value": 76.80554573106035}, {"type": "dot_recall", "value": 83.60024638127503}, {"type": "euclidean_accuracy", "value": 89.86882446540149}, {"type": "euclidean_accuracy_threshold", "value": 62.08193898200989}, {"type": "euclidean_ap", "value": 87.57517549192228}, {"type": "euclidean_f1", "value": 80.05286925872892}, {"type": "euclidean_f1_threshold", "value": 66.65036082267761}, {"type": "euclidean_precision", "value": 76.51063232507545}, {"type": "euclidean_recall", "value": 83.93902063443178}, {"type": "main_score", "value": 87.64162614197194}, {"type": "manhattan_accuracy", "value": 89.8959909962355}, {"type": "manhattan_accuracy_threshold", "value": 4176.108169555664}, {"type": "manhattan_ap", "value": 87.64162614197194}, {"type": "manhattan_f1", "value": 80.17116279069768}, {"type": "manhattan_f1_threshold", "value": 4433.153533935547}, {"type": "manhattan_precision", "value": 77.57615035644848}, {"type": "manhattan_recall", "value": 82.94579611949491}, {"type": "max_ap", "value": 87.64162614197194}, {"type": "max_f1", "value": 80.17116279069768}, {"type": "max_precision", "value": 77.57615035644848}, {"type": "max_recall", "value": 83.93902063443178}, {"type": "similarity_accuracy", "value": 89.88628866379477}, {"type": "similarity_accuracy_threshold", "value": 80.8050274848938}, {"type": "similarity_ap", "value": 87.57594591596816}, {"type": "similarity_f1", "value": 80.0812257707218}, {"type": "similarity_f1_threshold", "value": 77.990061044693}, {"type": "similarity_precision", "value": 76.93126197063205}, {"type": "similarity_recall", "value": 83.50015398829689}]}]}]} |
sunzx0810/gte-Qwen2-7B-instruct-Q5_K_M-GGUF | sunzx0810 | sentence-similarity | [
"sentence-transformers",
"gguf",
"qwen2",
"text-generation",
"mteb",
"transformers",
"Qwen2",
"sentence-similarity",
"llama-cpp",
"gguf-my-repo",
"custom_code",
"base_model:Alibaba-NLP/gte-Qwen2-7B-instruct",
"base_model:quantized:Alibaba-NLP/gte-Qwen2-7B-instruct",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us",
"conversational"
]
| 2024-06-20T03:38:41 | 2024-06-25T07:02:31 | 114 | 6 | ---
base_model: Alibaba-NLP/gte-Qwen2-7B-instruct
license: apache-2.0
tags:
- mteb
- sentence-transformers
- transformers
- Qwen2
- sentence-similarity
- llama-cpp
- gguf-my-repo
model-index:
- name: gte-qwen2-7B-instruct
results:
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (en)
type: mteb/amazon_counterfactual
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 91.31343283582089
- type: ap
value: 67.64251402604096
- type: f1
value: 87.53372530755692
- task:
type: Classification
dataset:
name: MTEB AmazonPolarityClassification
type: mteb/amazon_polarity
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 97.497825
- type: ap
value: 96.30329547047529
- type: f1
value: 97.49769793778039
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (en)
type: mteb/amazon_reviews_multi
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 62.564
- type: f1
value: 60.975777935041066
- task:
type: Retrieval
dataset:
name: MTEB ArguAna
type: mteb/arguana
config: default
split: test
revision: c22ab2a51041ffd869aaddef7af8d8215647e41a
metrics:
- type: map_at_1
value: 36.486000000000004
- type: map_at_10
value: 54.842
- type: map_at_100
value: 55.206999999999994
- type: map_at_1000
value: 55.206999999999994
- type: map_at_3
value: 49.893
- type: map_at_5
value: 53.105000000000004
- type: mrr_at_1
value: 37.34
- type: mrr_at_10
value: 55.143
- type: mrr_at_100
value: 55.509
- type: mrr_at_1000
value: 55.509
- type: mrr_at_3
value: 50.212999999999994
- type: mrr_at_5
value: 53.432
- type: ndcg_at_1
value: 36.486000000000004
- type: ndcg_at_10
value: 64.273
- type: ndcg_at_100
value: 65.66199999999999
- type: ndcg_at_1000
value: 65.66199999999999
- type: ndcg_at_3
value: 54.352999999999994
- type: ndcg_at_5
value: 60.131
- type: precision_at_1
value: 36.486000000000004
- type: precision_at_10
value: 9.395000000000001
- type: precision_at_100
value: 0.996
- type: precision_at_1000
value: 0.1
- type: precision_at_3
value: 22.428
- type: precision_at_5
value: 16.259
- type: recall_at_1
value: 36.486000000000004
- type: recall_at_10
value: 93.95400000000001
- type: recall_at_100
value: 99.644
- type: recall_at_1000
value: 99.644
- type: recall_at_3
value: 67.283
- type: recall_at_5
value: 81.294
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringP2P
type: mteb/arxiv-clustering-p2p
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 56.461169803700564
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringS2S
type: mteb/arxiv-clustering-s2s
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 51.73600434466286
- task:
type: Reranking
dataset:
name: MTEB AskUbuntuDupQuestions
type: mteb/askubuntudupquestions-reranking
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 67.57827065898053
- type: mrr
value: 79.08136569493911
- task:
type: STS
dataset:
name: MTEB BIOSSES
type: mteb/biosses-sts
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_pearson
value: 83.53324575999243
- type: cos_sim_spearman
value: 81.37173362822374
- type: euclidean_pearson
value: 82.19243335103444
- type: euclidean_spearman
value: 81.33679307304334
- type: manhattan_pearson
value: 82.38752665975699
- type: manhattan_spearman
value: 81.31510583189689
- task:
type: Classification
dataset:
name: MTEB Banking77Classification
type: mteb/banking77
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 87.56818181818181
- type: f1
value: 87.25826722019875
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringP2P
type: mteb/biorxiv-clustering-p2p
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 50.09239610327673
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringS2S
type: mteb/biorxiv-clustering-s2s
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 46.64733054606282
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackAndroidRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: f46a197baaae43b4f621051089b82a364682dfeb
metrics:
- type: map_at_1
value: 33.997
- type: map_at_10
value: 48.176
- type: map_at_100
value: 49.82
- type: map_at_1000
value: 49.924
- type: map_at_3
value: 43.626
- type: map_at_5
value: 46.275
- type: mrr_at_1
value: 42.059999999999995
- type: mrr_at_10
value: 53.726
- type: mrr_at_100
value: 54.398
- type: mrr_at_1000
value: 54.416
- type: mrr_at_3
value: 50.714999999999996
- type: mrr_at_5
value: 52.639
- type: ndcg_at_1
value: 42.059999999999995
- type: ndcg_at_10
value: 55.574999999999996
- type: ndcg_at_100
value: 60.744
- type: ndcg_at_1000
value: 61.85699999999999
- type: ndcg_at_3
value: 49.363
- type: ndcg_at_5
value: 52.44
- type: precision_at_1
value: 42.059999999999995
- type: precision_at_10
value: 11.101999999999999
- type: precision_at_100
value: 1.73
- type: precision_at_1000
value: 0.218
- type: precision_at_3
value: 24.464
- type: precision_at_5
value: 18.026
- type: recall_at_1
value: 33.997
- type: recall_at_10
value: 70.35900000000001
- type: recall_at_100
value: 91.642
- type: recall_at_1000
value: 97.977
- type: recall_at_3
value: 52.76
- type: recall_at_5
value: 61.148
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackEnglishRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: ad9991cb51e31e31e430383c75ffb2885547b5f0
metrics:
- type: map_at_1
value: 35.884
- type: map_at_10
value: 48.14
- type: map_at_100
value: 49.5
- type: map_at_1000
value: 49.63
- type: map_at_3
value: 44.646
- type: map_at_5
value: 46.617999999999995
- type: mrr_at_1
value: 44.458999999999996
- type: mrr_at_10
value: 53.751000000000005
- type: mrr_at_100
value: 54.37800000000001
- type: mrr_at_1000
value: 54.415
- type: mrr_at_3
value: 51.815
- type: mrr_at_5
value: 52.882
- type: ndcg_at_1
value: 44.458999999999996
- type: ndcg_at_10
value: 54.157
- type: ndcg_at_100
value: 58.362
- type: ndcg_at_1000
value: 60.178
- type: ndcg_at_3
value: 49.661
- type: ndcg_at_5
value: 51.74999999999999
- type: precision_at_1
value: 44.458999999999996
- type: precision_at_10
value: 10.248
- type: precision_at_100
value: 1.5890000000000002
- type: precision_at_1000
value: 0.207
- type: precision_at_3
value: 23.928
- type: precision_at_5
value: 16.878999999999998
- type: recall_at_1
value: 35.884
- type: recall_at_10
value: 64.798
- type: recall_at_100
value: 82.345
- type: recall_at_1000
value: 93.267
- type: recall_at_3
value: 51.847
- type: recall_at_5
value: 57.601
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGamingRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: 4885aa143210c98657558c04aaf3dc47cfb54340
metrics:
- type: map_at_1
value: 39.383
- type: map_at_10
value: 53.714
- type: map_at_100
value: 54.838
- type: map_at_1000
value: 54.87800000000001
- type: map_at_3
value: 50.114999999999995
- type: map_at_5
value: 52.153000000000006
- type: mrr_at_1
value: 45.016
- type: mrr_at_10
value: 56.732000000000006
- type: mrr_at_100
value: 57.411
- type: mrr_at_1000
value: 57.431
- type: mrr_at_3
value: 54.044000000000004
- type: mrr_at_5
value: 55.639
- type: ndcg_at_1
value: 45.016
- type: ndcg_at_10
value: 60.228
- type: ndcg_at_100
value: 64.277
- type: ndcg_at_1000
value: 65.07
- type: ndcg_at_3
value: 54.124
- type: ndcg_at_5
value: 57.147000000000006
- type: precision_at_1
value: 45.016
- type: precision_at_10
value: 9.937
- type: precision_at_100
value: 1.288
- type: precision_at_1000
value: 0.13899999999999998
- type: precision_at_3
value: 24.471999999999998
- type: precision_at_5
value: 16.991
- type: recall_at_1
value: 39.383
- type: recall_at_10
value: 76.175
- type: recall_at_100
value: 93.02
- type: recall_at_1000
value: 98.60900000000001
- type: recall_at_3
value: 60.265
- type: recall_at_5
value: 67.46600000000001
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGisRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: 5003b3064772da1887988e05400cf3806fe491f2
metrics:
- type: map_at_1
value: 27.426000000000002
- type: map_at_10
value: 37.397000000000006
- type: map_at_100
value: 38.61
- type: map_at_1000
value: 38.678000000000004
- type: map_at_3
value: 34.150999999999996
- type: map_at_5
value: 36.137
- type: mrr_at_1
value: 29.944
- type: mrr_at_10
value: 39.654
- type: mrr_at_100
value: 40.638000000000005
- type: mrr_at_1000
value: 40.691
- type: mrr_at_3
value: 36.817
- type: mrr_at_5
value: 38.524
- type: ndcg_at_1
value: 29.944
- type: ndcg_at_10
value: 43.094
- type: ndcg_at_100
value: 48.789
- type: ndcg_at_1000
value: 50.339999999999996
- type: ndcg_at_3
value: 36.984
- type: ndcg_at_5
value: 40.248
- type: precision_at_1
value: 29.944
- type: precision_at_10
value: 6.78
- type: precision_at_100
value: 1.024
- type: precision_at_1000
value: 0.11800000000000001
- type: precision_at_3
value: 15.895000000000001
- type: precision_at_5
value: 11.39
- type: recall_at_1
value: 27.426000000000002
- type: recall_at_10
value: 58.464000000000006
- type: recall_at_100
value: 84.193
- type: recall_at_1000
value: 95.52000000000001
- type: recall_at_3
value: 42.172
- type: recall_at_5
value: 50.101
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackMathematicaRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: 90fceea13679c63fe563ded68f3b6f06e50061de
metrics:
- type: map_at_1
value: 19.721
- type: map_at_10
value: 31.604
- type: map_at_100
value: 32.972
- type: map_at_1000
value: 33.077
- type: map_at_3
value: 27.218999999999998
- type: map_at_5
value: 29.53
- type: mrr_at_1
value: 25.0
- type: mrr_at_10
value: 35.843
- type: mrr_at_100
value: 36.785000000000004
- type: mrr_at_1000
value: 36.842000000000006
- type: mrr_at_3
value: 32.193
- type: mrr_at_5
value: 34.264
- type: ndcg_at_1
value: 25.0
- type: ndcg_at_10
value: 38.606
- type: ndcg_at_100
value: 44.272
- type: ndcg_at_1000
value: 46.527
- type: ndcg_at_3
value: 30.985000000000003
- type: ndcg_at_5
value: 34.43
- type: precision_at_1
value: 25.0
- type: precision_at_10
value: 7.811
- type: precision_at_100
value: 1.203
- type: precision_at_1000
value: 0.15
- type: precision_at_3
value: 15.423
- type: precision_at_5
value: 11.791
- type: recall_at_1
value: 19.721
- type: recall_at_10
value: 55.625
- type: recall_at_100
value: 79.34400000000001
- type: recall_at_1000
value: 95.208
- type: recall_at_3
value: 35.19
- type: recall_at_5
value: 43.626
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackPhysicsRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: 79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4
metrics:
- type: map_at_1
value: 33.784
- type: map_at_10
value: 47.522
- type: map_at_100
value: 48.949999999999996
- type: map_at_1000
value: 49.038
- type: map_at_3
value: 43.284
- type: map_at_5
value: 45.629
- type: mrr_at_1
value: 41.482
- type: mrr_at_10
value: 52.830999999999996
- type: mrr_at_100
value: 53.559999999999995
- type: mrr_at_1000
value: 53.588
- type: mrr_at_3
value: 50.016000000000005
- type: mrr_at_5
value: 51.614000000000004
- type: ndcg_at_1
value: 41.482
- type: ndcg_at_10
value: 54.569
- type: ndcg_at_100
value: 59.675999999999995
- type: ndcg_at_1000
value: 60.989000000000004
- type: ndcg_at_3
value: 48.187000000000005
- type: ndcg_at_5
value: 51.183
- type: precision_at_1
value: 41.482
- type: precision_at_10
value: 10.221
- type: precision_at_100
value: 1.486
- type: precision_at_1000
value: 0.17500000000000002
- type: precision_at_3
value: 23.548
- type: precision_at_5
value: 16.805
- type: recall_at_1
value: 33.784
- type: recall_at_10
value: 69.798
- type: recall_at_100
value: 90.098
- type: recall_at_1000
value: 98.176
- type: recall_at_3
value: 52.127
- type: recall_at_5
value: 59.861
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackProgrammersRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: 6184bc1440d2dbc7612be22b50686b8826d22b32
metrics:
- type: map_at_1
value: 28.038999999999998
- type: map_at_10
value: 41.904
- type: map_at_100
value: 43.36
- type: map_at_1000
value: 43.453
- type: map_at_3
value: 37.785999999999994
- type: map_at_5
value: 40.105000000000004
- type: mrr_at_1
value: 35.046
- type: mrr_at_10
value: 46.926
- type: mrr_at_100
value: 47.815000000000005
- type: mrr_at_1000
value: 47.849000000000004
- type: mrr_at_3
value: 44.273
- type: mrr_at_5
value: 45.774
- type: ndcg_at_1
value: 35.046
- type: ndcg_at_10
value: 48.937000000000005
- type: ndcg_at_100
value: 54.544000000000004
- type: ndcg_at_1000
value: 56.069
- type: ndcg_at_3
value: 42.858000000000004
- type: ndcg_at_5
value: 45.644
- type: precision_at_1
value: 35.046
- type: precision_at_10
value: 9.452
- type: precision_at_100
value: 1.429
- type: precision_at_1000
value: 0.173
- type: precision_at_3
value: 21.346999999999998
- type: precision_at_5
value: 15.342
- type: recall_at_1
value: 28.038999999999998
- type: recall_at_10
value: 64.59700000000001
- type: recall_at_100
value: 87.735
- type: recall_at_1000
value: 97.41300000000001
- type: recall_at_3
value: 47.368
- type: recall_at_5
value: 54.93900000000001
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4
metrics:
- type: map_at_1
value: 28.17291666666667
- type: map_at_10
value: 40.025749999999995
- type: map_at_100
value: 41.39208333333333
- type: map_at_1000
value: 41.499249999999996
- type: map_at_3
value: 36.347
- type: map_at_5
value: 38.41391666666667
- type: mrr_at_1
value: 33.65925
- type: mrr_at_10
value: 44.085499999999996
- type: mrr_at_100
value: 44.94116666666667
- type: mrr_at_1000
value: 44.9855
- type: mrr_at_3
value: 41.2815
- type: mrr_at_5
value: 42.91491666666666
- type: ndcg_at_1
value: 33.65925
- type: ndcg_at_10
value: 46.430833333333325
- type: ndcg_at_100
value: 51.761
- type: ndcg_at_1000
value: 53.50899999999999
- type: ndcg_at_3
value: 40.45133333333333
- type: ndcg_at_5
value: 43.31483333333334
- type: precision_at_1
value: 33.65925
- type: precision_at_10
value: 8.4995
- type: precision_at_100
value: 1.3210000000000004
- type: precision_at_1000
value: 0.16591666666666666
- type: precision_at_3
value: 19.165083333333335
- type: precision_at_5
value: 13.81816666666667
- type: recall_at_1
value: 28.17291666666667
- type: recall_at_10
value: 61.12624999999999
- type: recall_at_100
value: 83.97266666666667
- type: recall_at_1000
value: 95.66550000000001
- type: recall_at_3
value: 44.661249999999995
- type: recall_at_5
value: 51.983333333333334
- type: map_at_1
value: 17.936
- type: map_at_10
value: 27.399
- type: map_at_100
value: 28.632
- type: map_at_1000
value: 28.738000000000003
- type: map_at_3
value: 24.456
- type: map_at_5
value: 26.06
- type: mrr_at_1
value: 19.224
- type: mrr_at_10
value: 28.998
- type: mrr_at_100
value: 30.11
- type: mrr_at_1000
value: 30.177
- type: mrr_at_3
value: 26.247999999999998
- type: mrr_at_5
value: 27.708
- type: ndcg_at_1
value: 19.224
- type: ndcg_at_10
value: 32.911
- type: ndcg_at_100
value: 38.873999999999995
- type: ndcg_at_1000
value: 41.277
- type: ndcg_at_3
value: 27.142
- type: ndcg_at_5
value: 29.755
- type: precision_at_1
value: 19.224
- type: precision_at_10
value: 5.6930000000000005
- type: precision_at_100
value: 0.9259999999999999
- type: precision_at_1000
value: 0.126
- type: precision_at_3
value: 12.138
- type: precision_at_5
value: 8.909
- type: recall_at_1
value: 17.936
- type: recall_at_10
value: 48.096
- type: recall_at_100
value: 75.389
- type: recall_at_1000
value: 92.803
- type: recall_at_3
value: 32.812999999999995
- type: recall_at_5
value: 38.851
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackStatsRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: 65ac3a16b8e91f9cee4c9828cc7c335575432a2a
metrics:
- type: map_at_1
value: 24.681
- type: map_at_10
value: 34.892
- type: map_at_100
value: 35.996
- type: map_at_1000
value: 36.083
- type: map_at_3
value: 31.491999999999997
- type: map_at_5
value: 33.632
- type: mrr_at_1
value: 28.528
- type: mrr_at_10
value: 37.694
- type: mrr_at_100
value: 38.613
- type: mrr_at_1000
value: 38.668
- type: mrr_at_3
value: 34.714
- type: mrr_at_5
value: 36.616
- type: ndcg_at_1
value: 28.528
- type: ndcg_at_10
value: 40.703
- type: ndcg_at_100
value: 45.993
- type: ndcg_at_1000
value: 47.847
- type: ndcg_at_3
value: 34.622
- type: ndcg_at_5
value: 38.035999999999994
- type: precision_at_1
value: 28.528
- type: precision_at_10
value: 6.902
- type: precision_at_100
value: 1.0370000000000001
- type: precision_at_1000
value: 0.126
- type: precision_at_3
value: 15.798000000000002
- type: precision_at_5
value: 11.655999999999999
- type: recall_at_1
value: 24.681
- type: recall_at_10
value: 55.81
- type: recall_at_100
value: 79.785
- type: recall_at_1000
value: 92.959
- type: recall_at_3
value: 39.074
- type: recall_at_5
value: 47.568
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackTexRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: 46989137a86843e03a6195de44b09deda022eec7
metrics:
- type: map_at_1
value: 18.627
- type: map_at_10
value: 27.872000000000003
- type: map_at_100
value: 29.237999999999996
- type: map_at_1000
value: 29.363
- type: map_at_3
value: 24.751
- type: map_at_5
value: 26.521
- type: mrr_at_1
value: 23.021
- type: mrr_at_10
value: 31.924000000000003
- type: mrr_at_100
value: 32.922000000000004
- type: mrr_at_1000
value: 32.988
- type: mrr_at_3
value: 29.192
- type: mrr_at_5
value: 30.798
- type: ndcg_at_1
value: 23.021
- type: ndcg_at_10
value: 33.535
- type: ndcg_at_100
value: 39.732
- type: ndcg_at_1000
value: 42.201
- type: ndcg_at_3
value: 28.153
- type: ndcg_at_5
value: 30.746000000000002
- type: precision_at_1
value: 23.021
- type: precision_at_10
value: 6.459
- type: precision_at_100
value: 1.1320000000000001
- type: precision_at_1000
value: 0.153
- type: precision_at_3
value: 13.719000000000001
- type: precision_at_5
value: 10.193000000000001
- type: recall_at_1
value: 18.627
- type: recall_at_10
value: 46.463
- type: recall_at_100
value: 74.226
- type: recall_at_1000
value: 91.28500000000001
- type: recall_at_3
value: 31.357000000000003
- type: recall_at_5
value: 38.067
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackUnixRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: 6c6430d3a6d36f8d2a829195bc5dc94d7e063e53
metrics:
- type: map_at_1
value: 31.457
- type: map_at_10
value: 42.888
- type: map_at_100
value: 44.24
- type: map_at_1000
value: 44.327
- type: map_at_3
value: 39.588
- type: map_at_5
value: 41.423
- type: mrr_at_1
value: 37.126999999999995
- type: mrr_at_10
value: 47.083000000000006
- type: mrr_at_100
value: 47.997
- type: mrr_at_1000
value: 48.044
- type: mrr_at_3
value: 44.574000000000005
- type: mrr_at_5
value: 46.202
- type: ndcg_at_1
value: 37.126999999999995
- type: ndcg_at_10
value: 48.833
- type: ndcg_at_100
value: 54.327000000000005
- type: ndcg_at_1000
value: 56.011
- type: ndcg_at_3
value: 43.541999999999994
- type: ndcg_at_5
value: 46.127
- type: precision_at_1
value: 37.126999999999995
- type: precision_at_10
value: 8.376999999999999
- type: precision_at_100
value: 1.2309999999999999
- type: precision_at_1000
value: 0.146
- type: precision_at_3
value: 20.211000000000002
- type: precision_at_5
value: 14.16
- type: recall_at_1
value: 31.457
- type: recall_at_10
value: 62.369
- type: recall_at_100
value: 85.444
- type: recall_at_1000
value: 96.65599999999999
- type: recall_at_3
value: 47.961
- type: recall_at_5
value: 54.676
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackWebmastersRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: 160c094312a0e1facb97e55eeddb698c0abe3571
metrics:
- type: map_at_1
value: 27.139999999999997
- type: map_at_10
value: 38.801
- type: map_at_100
value: 40.549
- type: map_at_1000
value: 40.802
- type: map_at_3
value: 35.05
- type: map_at_5
value: 36.884
- type: mrr_at_1
value: 33.004
- type: mrr_at_10
value: 43.864
- type: mrr_at_100
value: 44.667
- type: mrr_at_1000
value: 44.717
- type: mrr_at_3
value: 40.777
- type: mrr_at_5
value: 42.319
- type: ndcg_at_1
value: 33.004
- type: ndcg_at_10
value: 46.022
- type: ndcg_at_100
value: 51.542
- type: ndcg_at_1000
value: 53.742000000000004
- type: ndcg_at_3
value: 39.795
- type: ndcg_at_5
value: 42.272
- type: precision_at_1
value: 33.004
- type: precision_at_10
value: 9.012
- type: precision_at_100
value: 1.7770000000000001
- type: precision_at_1000
value: 0.26
- type: precision_at_3
value: 19.038
- type: precision_at_5
value: 13.675999999999998
- type: recall_at_1
value: 27.139999999999997
- type: recall_at_10
value: 60.961
- type: recall_at_100
value: 84.451
- type: recall_at_1000
value: 98.113
- type: recall_at_3
value: 43.001
- type: recall_at_5
value: 49.896
- task:
type: Retrieval
dataset:
name: MTEB ClimateFEVER
type: mteb/climate-fever
config: default
split: test
revision: 47f2ac6acb640fc46020b02a5b59fdda04d39380
metrics:
- type: map_at_1
value: 22.076999999999998
- type: map_at_10
value: 35.44
- type: map_at_100
value: 37.651
- type: map_at_1000
value: 37.824999999999996
- type: map_at_3
value: 30.764999999999997
- type: map_at_5
value: 33.26
- type: mrr_at_1
value: 50.163000000000004
- type: mrr_at_10
value: 61.207
- type: mrr_at_100
value: 61.675000000000004
- type: mrr_at_1000
value: 61.692
- type: mrr_at_3
value: 58.60999999999999
- type: mrr_at_5
value: 60.307
- type: ndcg_at_1
value: 50.163000000000004
- type: ndcg_at_10
value: 45.882
- type: ndcg_at_100
value: 53.239999999999995
- type: ndcg_at_1000
value: 55.852000000000004
- type: ndcg_at_3
value: 40.514
- type: ndcg_at_5
value: 42.038
- type: precision_at_1
value: 50.163000000000004
- type: precision_at_10
value: 13.466000000000001
- type: precision_at_100
value: 2.164
- type: precision_at_1000
value: 0.266
- type: precision_at_3
value: 29.707
- type: precision_at_5
value: 21.694
- type: recall_at_1
value: 22.076999999999998
- type: recall_at_10
value: 50.193
- type: recall_at_100
value: 74.993
- type: recall_at_1000
value: 89.131
- type: recall_at_3
value: 35.472
- type: recall_at_5
value: 41.814
- task:
type: Retrieval
dataset:
name: MTEB DBPedia
type: mteb/dbpedia
config: default
split: test
revision: c0f706b76e590d620bd6618b3ca8efdd34e2d659
metrics:
- type: map_at_1
value: 9.953
- type: map_at_10
value: 24.515
- type: map_at_100
value: 36.173
- type: map_at_1000
value: 38.351
- type: map_at_3
value: 16.592000000000002
- type: map_at_5
value: 20.036
- type: mrr_at_1
value: 74.25
- type: mrr_at_10
value: 81.813
- type: mrr_at_100
value: 82.006
- type: mrr_at_1000
value: 82.011
- type: mrr_at_3
value: 80.875
- type: mrr_at_5
value: 81.362
- type: ndcg_at_1
value: 62.5
- type: ndcg_at_10
value: 52.42
- type: ndcg_at_100
value: 56.808
- type: ndcg_at_1000
value: 63.532999999999994
- type: ndcg_at_3
value: 56.654
- type: ndcg_at_5
value: 54.18300000000001
- type: precision_at_1
value: 74.25
- type: precision_at_10
value: 42.699999999999996
- type: precision_at_100
value: 13.675
- type: precision_at_1000
value: 2.664
- type: precision_at_3
value: 60.5
- type: precision_at_5
value: 52.800000000000004
- type: recall_at_1
value: 9.953
- type: recall_at_10
value: 30.253999999999998
- type: recall_at_100
value: 62.516000000000005
- type: recall_at_1000
value: 84.163
- type: recall_at_3
value: 18.13
- type: recall_at_5
value: 22.771
- task:
type: Classification
dataset:
name: MTEB EmotionClassification
type: mteb/emotion
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 79.455
- type: f1
value: 74.16798697647569
- task:
type: Retrieval
dataset:
name: MTEB FEVER
type: mteb/fever
config: default
split: test
revision: bea83ef9e8fb933d90a2f1d5515737465d613e12
metrics:
- type: map_at_1
value: 87.531
- type: map_at_10
value: 93.16799999999999
- type: map_at_100
value: 93.341
- type: map_at_1000
value: 93.349
- type: map_at_3
value: 92.444
- type: map_at_5
value: 92.865
- type: mrr_at_1
value: 94.014
- type: mrr_at_10
value: 96.761
- type: mrr_at_100
value: 96.762
- type: mrr_at_1000
value: 96.762
- type: mrr_at_3
value: 96.672
- type: mrr_at_5
value: 96.736
- type: ndcg_at_1
value: 94.014
- type: ndcg_at_10
value: 95.112
- type: ndcg_at_100
value: 95.578
- type: ndcg_at_1000
value: 95.68900000000001
- type: ndcg_at_3
value: 94.392
- type: ndcg_at_5
value: 94.72500000000001
- type: precision_at_1
value: 94.014
- type: precision_at_10
value: 11.065
- type: precision_at_100
value: 1.157
- type: precision_at_1000
value: 0.11800000000000001
- type: precision_at_3
value: 35.259
- type: precision_at_5
value: 21.599
- type: recall_at_1
value: 87.531
- type: recall_at_10
value: 97.356
- type: recall_at_100
value: 98.965
- type: recall_at_1000
value: 99.607
- type: recall_at_3
value: 95.312
- type: recall_at_5
value: 96.295
- task:
type: Retrieval
dataset:
name: MTEB FiQA2018
type: mteb/fiqa
config: default
split: test
revision: 27a168819829fe9bcd655c2df245fb19452e8e06
metrics:
- type: map_at_1
value: 32.055
- type: map_at_10
value: 53.114
- type: map_at_100
value: 55.235
- type: map_at_1000
value: 55.345
- type: map_at_3
value: 45.854
- type: map_at_5
value: 50.025
- type: mrr_at_1
value: 60.34
- type: mrr_at_10
value: 68.804
- type: mrr_at_100
value: 69.309
- type: mrr_at_1000
value: 69.32199999999999
- type: mrr_at_3
value: 66.40899999999999
- type: mrr_at_5
value: 67.976
- type: ndcg_at_1
value: 60.34
- type: ndcg_at_10
value: 62.031000000000006
- type: ndcg_at_100
value: 68.00500000000001
- type: ndcg_at_1000
value: 69.286
- type: ndcg_at_3
value: 56.355999999999995
- type: ndcg_at_5
value: 58.687
- type: precision_at_1
value: 60.34
- type: precision_at_10
value: 17.176
- type: precision_at_100
value: 2.36
- type: precision_at_1000
value: 0.259
- type: precision_at_3
value: 37.14
- type: precision_at_5
value: 27.809
- type: recall_at_1
value: 32.055
- type: recall_at_10
value: 70.91
- type: recall_at_100
value: 91.83
- type: recall_at_1000
value: 98.871
- type: recall_at_3
value: 51.202999999999996
- type: recall_at_5
value: 60.563
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA
type: mteb/hotpotqa
config: default
split: test
revision: ab518f4d6fcca38d87c25209f94beba119d02014
metrics:
- type: map_at_1
value: 43.68
- type: map_at_10
value: 64.389
- type: map_at_100
value: 65.24
- type: map_at_1000
value: 65.303
- type: map_at_3
value: 61.309000000000005
- type: map_at_5
value: 63.275999999999996
- type: mrr_at_1
value: 87.36
- type: mrr_at_10
value: 91.12
- type: mrr_at_100
value: 91.227
- type: mrr_at_1000
value: 91.229
- type: mrr_at_3
value: 90.57600000000001
- type: mrr_at_5
value: 90.912
- type: ndcg_at_1
value: 87.36
- type: ndcg_at_10
value: 73.076
- type: ndcg_at_100
value: 75.895
- type: ndcg_at_1000
value: 77.049
- type: ndcg_at_3
value: 68.929
- type: ndcg_at_5
value: 71.28
- type: precision_at_1
value: 87.36
- type: precision_at_10
value: 14.741000000000001
- type: precision_at_100
value: 1.694
- type: precision_at_1000
value: 0.185
- type: precision_at_3
value: 43.043
- type: precision_at_5
value: 27.681
- type: recall_at_1
value: 43.68
- type: recall_at_10
value: 73.707
- type: recall_at_100
value: 84.7
- type: recall_at_1000
value: 92.309
- type: recall_at_3
value: 64.564
- type: recall_at_5
value: 69.203
- task:
type: Classification
dataset:
name: MTEB ImdbClassification
type: mteb/imdb
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 96.75399999999999
- type: ap
value: 95.29389839242187
- type: f1
value: 96.75348377433475
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO
type: mteb/msmarco
config: default
split: dev
revision: c5a29a104738b98a9e76336939199e264163d4a0
metrics:
- type: map_at_1
value: 25.176
- type: map_at_10
value: 38.598
- type: map_at_100
value: 39.707
- type: map_at_1000
value: 39.744
- type: map_at_3
value: 34.566
- type: map_at_5
value: 36.863
- type: mrr_at_1
value: 25.874000000000002
- type: mrr_at_10
value: 39.214
- type: mrr_at_100
value: 40.251
- type: mrr_at_1000
value: 40.281
- type: mrr_at_3
value: 35.291
- type: mrr_at_5
value: 37.545
- type: ndcg_at_1
value: 25.874000000000002
- type: ndcg_at_10
value: 45.98
- type: ndcg_at_100
value: 51.197
- type: ndcg_at_1000
value: 52.073
- type: ndcg_at_3
value: 37.785999999999994
- type: ndcg_at_5
value: 41.870000000000005
- type: precision_at_1
value: 25.874000000000002
- type: precision_at_10
value: 7.181
- type: precision_at_100
value: 0.979
- type: precision_at_1000
value: 0.106
- type: precision_at_3
value: 16.051000000000002
- type: precision_at_5
value: 11.713
- type: recall_at_1
value: 25.176
- type: recall_at_10
value: 68.67699999999999
- type: recall_at_100
value: 92.55
- type: recall_at_1000
value: 99.164
- type: recall_at_3
value: 46.372
- type: recall_at_5
value: 56.16
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (en)
type: mteb/mtop_domain
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 99.03784769721841
- type: f1
value: 98.97791641821495
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (en)
type: mteb/mtop_intent
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 91.88326493388054
- type: f1
value: 73.74809928034335
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (en)
type: mteb/amazon_massive_intent
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 85.41358439811701
- type: f1
value: 83.503679460639
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (en)
type: mteb/amazon_massive_scenario
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 89.77135171486215
- type: f1
value: 88.89843747468366
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringP2P
type: mteb/medrxiv-clustering-p2p
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 46.22695362087359
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringS2S
type: mteb/medrxiv-clustering-s2s
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 44.132372165849425
- task:
type: Reranking
dataset:
name: MTEB MindSmallReranking
type: mteb/mind_small
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 33.35680810650402
- type: mrr
value: 34.72625715637218
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus
type: mteb/nfcorpus
config: default
split: test
revision: ec0fa4fe99da2ff19ca1214b7966684033a58814
metrics:
- type: map_at_1
value: 7.165000000000001
- type: map_at_10
value: 15.424
- type: map_at_100
value: 20.28
- type: map_at_1000
value: 22.065
- type: map_at_3
value: 11.236
- type: map_at_5
value: 13.025999999999998
- type: mrr_at_1
value: 51.702999999999996
- type: mrr_at_10
value: 59.965
- type: mrr_at_100
value: 60.667
- type: mrr_at_1000
value: 60.702999999999996
- type: mrr_at_3
value: 58.772000000000006
- type: mrr_at_5
value: 59.267
- type: ndcg_at_1
value: 49.536
- type: ndcg_at_10
value: 40.6
- type: ndcg_at_100
value: 37.848
- type: ndcg_at_1000
value: 46.657
- type: ndcg_at_3
value: 46.117999999999995
- type: ndcg_at_5
value: 43.619
- type: precision_at_1
value: 51.393
- type: precision_at_10
value: 30.31
- type: precision_at_100
value: 9.972
- type: precision_at_1000
value: 2.329
- type: precision_at_3
value: 43.137
- type: precision_at_5
value: 37.585
- type: recall_at_1
value: 7.165000000000001
- type: recall_at_10
value: 19.689999999999998
- type: recall_at_100
value: 39.237
- type: recall_at_1000
value: 71.417
- type: recall_at_3
value: 12.247
- type: recall_at_5
value: 14.902999999999999
- task:
type: Retrieval
dataset:
name: MTEB NQ
type: mteb/nq
config: default
split: test
revision: b774495ed302d8c44a3a7ea25c90dbce03968f31
metrics:
- type: map_at_1
value: 42.653999999999996
- type: map_at_10
value: 59.611999999999995
- type: map_at_100
value: 60.32300000000001
- type: map_at_1000
value: 60.336
- type: map_at_3
value: 55.584999999999994
- type: map_at_5
value: 58.19
- type: mrr_at_1
value: 47.683
- type: mrr_at_10
value: 62.06700000000001
- type: mrr_at_100
value: 62.537
- type: mrr_at_1000
value: 62.544999999999995
- type: mrr_at_3
value: 59.178
- type: mrr_at_5
value: 61.034
- type: ndcg_at_1
value: 47.654
- type: ndcg_at_10
value: 67.001
- type: ndcg_at_100
value: 69.73899999999999
- type: ndcg_at_1000
value: 69.986
- type: ndcg_at_3
value: 59.95700000000001
- type: ndcg_at_5
value: 64.025
- type: precision_at_1
value: 47.654
- type: precision_at_10
value: 10.367999999999999
- type: precision_at_100
value: 1.192
- type: precision_at_1000
value: 0.121
- type: precision_at_3
value: 26.651000000000003
- type: precision_at_5
value: 18.459
- type: recall_at_1
value: 42.653999999999996
- type: recall_at_10
value: 86.619
- type: recall_at_100
value: 98.04899999999999
- type: recall_at_1000
value: 99.812
- type: recall_at_3
value: 68.987
- type: recall_at_5
value: 78.158
- task:
type: Retrieval
dataset:
name: MTEB QuoraRetrieval
type: mteb/quora
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 72.538
- type: map_at_10
value: 86.702
- type: map_at_100
value: 87.31
- type: map_at_1000
value: 87.323
- type: map_at_3
value: 83.87
- type: map_at_5
value: 85.682
- type: mrr_at_1
value: 83.31
- type: mrr_at_10
value: 89.225
- type: mrr_at_100
value: 89.30399999999999
- type: mrr_at_1000
value: 89.30399999999999
- type: mrr_at_3
value: 88.44300000000001
- type: mrr_at_5
value: 89.005
- type: ndcg_at_1
value: 83.32000000000001
- type: ndcg_at_10
value: 90.095
- type: ndcg_at_100
value: 91.12
- type: ndcg_at_1000
value: 91.179
- type: ndcg_at_3
value: 87.606
- type: ndcg_at_5
value: 89.031
- type: precision_at_1
value: 83.32000000000001
- type: precision_at_10
value: 13.641
- type: precision_at_100
value: 1.541
- type: precision_at_1000
value: 0.157
- type: precision_at_3
value: 38.377
- type: precision_at_5
value: 25.162000000000003
- type: recall_at_1
value: 72.538
- type: recall_at_10
value: 96.47200000000001
- type: recall_at_100
value: 99.785
- type: recall_at_1000
value: 99.99900000000001
- type: recall_at_3
value: 89.278
- type: recall_at_5
value: 93.367
- task:
type: Clustering
dataset:
name: MTEB RedditClustering
type: mteb/reddit-clustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 73.55219145406065
- task:
type: Clustering
dataset:
name: MTEB RedditClusteringP2P
type: mteb/reddit-clustering-p2p
config: default
split: test
revision: 282350215ef01743dc01b456c7f5241fa8937f16
metrics:
- type: v_measure
value: 74.13437105242755
- task:
type: Retrieval
dataset:
name: MTEB SCIDOCS
type: mteb/scidocs
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 6.873
- type: map_at_10
value: 17.944
- type: map_at_100
value: 21.171
- type: map_at_1000
value: 21.528
- type: map_at_3
value: 12.415
- type: map_at_5
value: 15.187999999999999
- type: mrr_at_1
value: 33.800000000000004
- type: mrr_at_10
value: 46.455
- type: mrr_at_100
value: 47.378
- type: mrr_at_1000
value: 47.394999999999996
- type: mrr_at_3
value: 42.367
- type: mrr_at_5
value: 44.972
- type: ndcg_at_1
value: 33.800000000000004
- type: ndcg_at_10
value: 28.907
- type: ndcg_at_100
value: 39.695
- type: ndcg_at_1000
value: 44.582
- type: ndcg_at_3
value: 26.949
- type: ndcg_at_5
value: 23.988
- type: precision_at_1
value: 33.800000000000004
- type: precision_at_10
value: 15.079999999999998
- type: precision_at_100
value: 3.056
- type: precision_at_1000
value: 0.42100000000000004
- type: precision_at_3
value: 25.167
- type: precision_at_5
value: 21.26
- type: recall_at_1
value: 6.873
- type: recall_at_10
value: 30.568
- type: recall_at_100
value: 62.062
- type: recall_at_1000
value: 85.37700000000001
- type: recall_at_3
value: 15.312999999999999
- type: recall_at_5
value: 21.575
- task:
type: STS
dataset:
name: MTEB SICK-R
type: mteb/sickr-sts
config: default
split: test
revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
metrics:
- type: cos_sim_pearson
value: 82.37009118256057
- type: cos_sim_spearman
value: 79.27986395671529
- type: euclidean_pearson
value: 79.18037715442115
- type: euclidean_spearman
value: 79.28004791561621
- type: manhattan_pearson
value: 79.34062972800541
- type: manhattan_spearman
value: 79.43106695543402
- task:
type: STS
dataset:
name: MTEB STS12
type: mteb/sts12-sts
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_pearson
value: 87.48474767383833
- type: cos_sim_spearman
value: 79.54505388752513
- type: euclidean_pearson
value: 83.43282704179565
- type: euclidean_spearman
value: 79.54579919925405
- type: manhattan_pearson
value: 83.77564492427952
- type: manhattan_spearman
value: 79.84558396989286
- task:
type: STS
dataset:
name: MTEB STS13
type: mteb/sts13-sts
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_pearson
value: 88.803698035802
- type: cos_sim_spearman
value: 88.83451367754881
- type: euclidean_pearson
value: 88.28939285711628
- type: euclidean_spearman
value: 88.83528996073112
- type: manhattan_pearson
value: 88.28017412671795
- type: manhattan_spearman
value: 88.9228828016344
- task:
type: STS
dataset:
name: MTEB STS14
type: mteb/sts14-sts
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_pearson
value: 85.27469288153428
- type: cos_sim_spearman
value: 83.87477064876288
- type: euclidean_pearson
value: 84.2601737035379
- type: euclidean_spearman
value: 83.87431082479074
- type: manhattan_pearson
value: 84.3621547772745
- type: manhattan_spearman
value: 84.12094375000423
- task:
type: STS
dataset:
name: MTEB STS15
type: mteb/sts15-sts
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_pearson
value: 88.12749863201587
- type: cos_sim_spearman
value: 88.54287568368565
- type: euclidean_pearson
value: 87.90429700607999
- type: euclidean_spearman
value: 88.5437689576261
- type: manhattan_pearson
value: 88.19276653356833
- type: manhattan_spearman
value: 88.99995393814679
- task:
type: STS
dataset:
name: MTEB STS16
type: mteb/sts16-sts
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_pearson
value: 85.68398747560902
- type: cos_sim_spearman
value: 86.48815303460574
- type: euclidean_pearson
value: 85.52356631237954
- type: euclidean_spearman
value: 86.486391949551
- type: manhattan_pearson
value: 85.67267981761788
- type: manhattan_spearman
value: 86.7073696332485
- task:
type: STS
dataset:
name: MTEB STS17 (en-en)
type: mteb/sts17-crosslingual-sts
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 88.9057107443124
- type: cos_sim_spearman
value: 88.7312168757697
- type: euclidean_pearson
value: 88.72810439714794
- type: euclidean_spearman
value: 88.71976185854771
- type: manhattan_pearson
value: 88.50433745949111
- type: manhattan_spearman
value: 88.51726175544195
- task:
type: STS
dataset:
name: MTEB STS22 (en)
type: mteb/sts22-crosslingual-sts
config: en
split: test
revision: eea2b4fe26a775864c896887d910b76a8098ad3f
metrics:
- type: cos_sim_pearson
value: 67.59391795109886
- type: cos_sim_spearman
value: 66.87613008631367
- type: euclidean_pearson
value: 69.23198488262217
- type: euclidean_spearman
value: 66.85427723013692
- type: manhattan_pearson
value: 69.50730124841084
- type: manhattan_spearman
value: 67.10404669820792
- task:
type: STS
dataset:
name: MTEB STSBenchmark
type: mteb/stsbenchmark-sts
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_pearson
value: 87.0820605344619
- type: cos_sim_spearman
value: 86.8518089863434
- type: euclidean_pearson
value: 86.31087134689284
- type: euclidean_spearman
value: 86.8518520517941
- type: manhattan_pearson
value: 86.47203796160612
- type: manhattan_spearman
value: 87.1080149734421
- task:
type: Reranking
dataset:
name: MTEB SciDocsRR
type: mteb/scidocs-reranking
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 89.09255369305481
- type: mrr
value: 97.10323445617563
- task:
type: Retrieval
dataset:
name: MTEB SciFact
type: mteb/scifact
config: default
split: test
revision: 0228b52cf27578f30900b9e5271d331663a030d7
metrics:
- type: map_at_1
value: 61.260999999999996
- type: map_at_10
value: 74.043
- type: map_at_100
value: 74.37700000000001
- type: map_at_1000
value: 74.384
- type: map_at_3
value: 71.222
- type: map_at_5
value: 72.875
- type: mrr_at_1
value: 64.333
- type: mrr_at_10
value: 74.984
- type: mrr_at_100
value: 75.247
- type: mrr_at_1000
value: 75.25500000000001
- type: mrr_at_3
value: 73.167
- type: mrr_at_5
value: 74.35000000000001
- type: ndcg_at_1
value: 64.333
- type: ndcg_at_10
value: 79.06
- type: ndcg_at_100
value: 80.416
- type: ndcg_at_1000
value: 80.55600000000001
- type: ndcg_at_3
value: 74.753
- type: ndcg_at_5
value: 76.97500000000001
- type: precision_at_1
value: 64.333
- type: precision_at_10
value: 10.567
- type: precision_at_100
value: 1.1199999999999999
- type: precision_at_1000
value: 0.11299999999999999
- type: precision_at_3
value: 29.889
- type: precision_at_5
value: 19.533
- type: recall_at_1
value: 61.260999999999996
- type: recall_at_10
value: 93.167
- type: recall_at_100
value: 99.0
- type: recall_at_1000
value: 100.0
- type: recall_at_3
value: 81.667
- type: recall_at_5
value: 87.394
- task:
type: PairClassification
dataset:
name: MTEB SprintDuplicateQuestions
type: mteb/sprintduplicatequestions-pairclassification
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.71980198019801
- type: cos_sim_ap
value: 92.81616007802704
- type: cos_sim_f1
value: 85.17548454688318
- type: cos_sim_precision
value: 89.43894389438944
- type: cos_sim_recall
value: 81.3
- type: dot_accuracy
value: 99.71980198019801
- type: dot_ap
value: 92.81398760591358
- type: dot_f1
value: 85.17548454688318
- type: dot_precision
value: 89.43894389438944
- type: dot_recall
value: 81.3
- type: euclidean_accuracy
value: 99.71980198019801
- type: euclidean_ap
value: 92.81560637245072
- type: euclidean_f1
value: 85.17548454688318
- type: euclidean_precision
value: 89.43894389438944
- type: euclidean_recall
value: 81.3
- type: manhattan_accuracy
value: 99.73069306930694
- type: manhattan_ap
value: 93.14005487480794
- type: manhattan_f1
value: 85.56263269639068
- type: manhattan_precision
value: 91.17647058823529
- type: manhattan_recall
value: 80.60000000000001
- type: max_accuracy
value: 99.73069306930694
- type: max_ap
value: 93.14005487480794
- type: max_f1
value: 85.56263269639068
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClustering
type: mteb/stackexchange-clustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 79.86443362395185
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClusteringP2P
type: mteb/stackexchange-clustering-p2p
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 49.40897096662564
- task:
type: Reranking
dataset:
name: MTEB StackOverflowDupQuestions
type: mteb/stackoverflowdupquestions-reranking
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 55.66040806627947
- type: mrr
value: 56.58670475766064
- task:
type: Summarization
dataset:
name: MTEB SummEval
type: mteb/summeval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 31.51015090598575
- type: cos_sim_spearman
value: 31.35016454939226
- type: dot_pearson
value: 31.5150068731
- type: dot_spearman
value: 31.34790869023487
- task:
type: Retrieval
dataset:
name: MTEB TRECCOVID
type: mteb/trec-covid
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 0.254
- type: map_at_10
value: 2.064
- type: map_at_100
value: 12.909
- type: map_at_1000
value: 31.761
- type: map_at_3
value: 0.738
- type: map_at_5
value: 1.155
- type: mrr_at_1
value: 96.0
- type: mrr_at_10
value: 98.0
- type: mrr_at_100
value: 98.0
- type: mrr_at_1000
value: 98.0
- type: mrr_at_3
value: 98.0
- type: mrr_at_5
value: 98.0
- type: ndcg_at_1
value: 93.0
- type: ndcg_at_10
value: 82.258
- type: ndcg_at_100
value: 64.34
- type: ndcg_at_1000
value: 57.912
- type: ndcg_at_3
value: 90.827
- type: ndcg_at_5
value: 86.79
- type: precision_at_1
value: 96.0
- type: precision_at_10
value: 84.8
- type: precision_at_100
value: 66.0
- type: precision_at_1000
value: 25.356
- type: precision_at_3
value: 94.667
- type: precision_at_5
value: 90.4
- type: recall_at_1
value: 0.254
- type: recall_at_10
value: 2.1950000000000003
- type: recall_at_100
value: 16.088
- type: recall_at_1000
value: 54.559000000000005
- type: recall_at_3
value: 0.75
- type: recall_at_5
value: 1.191
- task:
type: Retrieval
dataset:
name: MTEB Touche2020
type: mteb/touche2020
config: default
split: test
revision: a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f
metrics:
- type: map_at_1
value: 2.976
- type: map_at_10
value: 11.389000000000001
- type: map_at_100
value: 18.429000000000002
- type: map_at_1000
value: 20.113
- type: map_at_3
value: 6.483
- type: map_at_5
value: 8.770999999999999
- type: mrr_at_1
value: 40.816
- type: mrr_at_10
value: 58.118
- type: mrr_at_100
value: 58.489999999999995
- type: mrr_at_1000
value: 58.489999999999995
- type: mrr_at_3
value: 53.061
- type: mrr_at_5
value: 57.041
- type: ndcg_at_1
value: 40.816
- type: ndcg_at_10
value: 30.567
- type: ndcg_at_100
value: 42.44
- type: ndcg_at_1000
value: 53.480000000000004
- type: ndcg_at_3
value: 36.016
- type: ndcg_at_5
value: 34.257
- type: precision_at_1
value: 42.857
- type: precision_at_10
value: 25.714
- type: precision_at_100
value: 8.429
- type: precision_at_1000
value: 1.5939999999999999
- type: precision_at_3
value: 36.735
- type: precision_at_5
value: 33.878
- type: recall_at_1
value: 2.976
- type: recall_at_10
value: 17.854999999999997
- type: recall_at_100
value: 51.833
- type: recall_at_1000
value: 86.223
- type: recall_at_3
value: 7.887
- type: recall_at_5
value: 12.026
- task:
type: Classification
dataset:
name: MTEB ToxicConversationsClassification
type: mteb/toxic_conversations_50k
config: default
split: test
revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
metrics:
- type: accuracy
value: 85.1174
- type: ap
value: 30.169441069345748
- type: f1
value: 69.79254701873245
- task:
type: Classification
dataset:
name: MTEB TweetSentimentExtractionClassification
type: mteb/tweet_sentiment_extraction
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 72.58347481607245
- type: f1
value: 72.74877295564937
- task:
type: Clustering
dataset:
name: MTEB TwentyNewsgroupsClustering
type: mteb/twentynewsgroups-clustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 53.90586138221305
- task:
type: PairClassification
dataset:
name: MTEB TwitterSemEval2015
type: mteb/twittersemeval2015-pairclassification
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 87.35769207844072
- type: cos_sim_ap
value: 77.9645072410354
- type: cos_sim_f1
value: 71.32352941176471
- type: cos_sim_precision
value: 66.5903890160183
- type: cos_sim_recall
value: 76.78100263852242
- type: dot_accuracy
value: 87.37557370209214
- type: dot_ap
value: 77.96250046429908
- type: dot_f1
value: 71.28932757557064
- type: dot_precision
value: 66.95249130938586
- type: dot_recall
value: 76.22691292875989
- type: euclidean_accuracy
value: 87.35173153722357
- type: euclidean_ap
value: 77.96520460741593
- type: euclidean_f1
value: 71.32470733210104
- type: euclidean_precision
value: 66.91329479768785
- type: euclidean_recall
value: 76.35883905013192
- type: manhattan_accuracy
value: 87.25636287774931
- type: manhattan_ap
value: 77.77752485611796
- type: manhattan_f1
value: 71.18148599269183
- type: manhattan_precision
value: 66.10859728506787
- type: manhattan_recall
value: 77.0976253298153
- type: max_accuracy
value: 87.37557370209214
- type: max_ap
value: 77.96520460741593
- type: max_f1
value: 71.32470733210104
- task:
type: PairClassification
dataset:
name: MTEB TwitterURLCorpus
type: mteb/twitterurlcorpus-pairclassification
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 89.38176737687739
- type: cos_sim_ap
value: 86.58811861657401
- type: cos_sim_f1
value: 79.09430644097604
- type: cos_sim_precision
value: 75.45085977911366
- type: cos_sim_recall
value: 83.10748383122882
- type: dot_accuracy
value: 89.38370784336554
- type: dot_ap
value: 86.58840606004333
- type: dot_f1
value: 79.10179860068133
- type: dot_precision
value: 75.44546153308643
- type: dot_recall
value: 83.13058207576223
- type: euclidean_accuracy
value: 89.38564830985369
- type: euclidean_ap
value: 86.58820721061164
- type: euclidean_f1
value: 79.09070942235888
- type: euclidean_precision
value: 75.38729937194697
- type: euclidean_recall
value: 83.17677856482906
- type: manhattan_accuracy
value: 89.40699344122326
- type: manhattan_ap
value: 86.60631843011362
- type: manhattan_f1
value: 79.14949970570925
- type: manhattan_precision
value: 75.78191039729502
- type: manhattan_recall
value: 82.83030489682784
- type: max_accuracy
value: 89.40699344122326
- type: max_ap
value: 86.60631843011362
- type: max_f1
value: 79.14949970570925
- task:
type: STS
dataset:
name: MTEB AFQMC
type: C-MTEB/AFQMC
config: default
split: validation
revision: b44c3b011063adb25877c13823db83bb193913c4
metrics:
- type: cos_sim_pearson
value: 65.58442135663871
- type: cos_sim_spearman
value: 72.2538631361313
- type: euclidean_pearson
value: 70.97255486607429
- type: euclidean_spearman
value: 72.25374250228647
- type: manhattan_pearson
value: 70.83250199989911
- type: manhattan_spearman
value: 72.14819496536272
- task:
type: STS
dataset:
name: MTEB ATEC
type: C-MTEB/ATEC
config: default
split: test
revision: 0f319b1142f28d00e055a6770f3f726ae9b7d865
metrics:
- type: cos_sim_pearson
value: 59.99478404929932
- type: cos_sim_spearman
value: 62.61836216999812
- type: euclidean_pearson
value: 66.86429811933593
- type: euclidean_spearman
value: 62.6183520374191
- type: manhattan_pearson
value: 66.8063778911633
- type: manhattan_spearman
value: 62.569607573241115
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (zh)
type: mteb/amazon_reviews_multi
config: zh
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 53.98400000000001
- type: f1
value: 51.21447361350723
- task:
type: STS
dataset:
name: MTEB BQ
type: C-MTEB/BQ
config: default
split: test
revision: e3dda5e115e487b39ec7e618c0c6a29137052a55
metrics:
- type: cos_sim_pearson
value: 79.11941660686553
- type: cos_sim_spearman
value: 81.25029594540435
- type: euclidean_pearson
value: 82.06973504238826
- type: euclidean_spearman
value: 81.2501989488524
- type: manhattan_pearson
value: 82.10094630392753
- type: manhattan_spearman
value: 81.27987244392389
- task:
type: Clustering
dataset:
name: MTEB CLSClusteringP2P
type: C-MTEB/CLSClusteringP2P
config: default
split: test
revision: 4b6227591c6c1a73bc76b1055f3b7f3588e72476
metrics:
- type: v_measure
value: 47.07270168705156
- task:
type: Clustering
dataset:
name: MTEB CLSClusteringS2S
type: C-MTEB/CLSClusteringS2S
config: default
split: test
revision: e458b3f5414b62b7f9f83499ac1f5497ae2e869f
metrics:
- type: v_measure
value: 45.98511703185043
- task:
type: Reranking
dataset:
name: MTEB CMedQAv1
type: C-MTEB/CMedQAv1-reranking
config: default
split: test
revision: 8d7f1e942507dac42dc58017c1a001c3717da7df
metrics:
- type: map
value: 88.19895157194931
- type: mrr
value: 90.21424603174603
- task:
type: Reranking
dataset:
name: MTEB CMedQAv2
type: C-MTEB/CMedQAv2-reranking
config: default
split: test
revision: 23d186750531a14a0357ca22cd92d712fd512ea0
metrics:
- type: map
value: 88.03317320980119
- type: mrr
value: 89.9461507936508
- task:
type: Retrieval
dataset:
name: MTEB CmedqaRetrieval
type: C-MTEB/CmedqaRetrieval
config: default
split: dev
revision: cd540c506dae1cf9e9a59c3e06f42030d54e7301
metrics:
- type: map_at_1
value: 29.037000000000003
- type: map_at_10
value: 42.001
- type: map_at_100
value: 43.773
- type: map_at_1000
value: 43.878
- type: map_at_3
value: 37.637
- type: map_at_5
value: 40.034
- type: mrr_at_1
value: 43.136
- type: mrr_at_10
value: 51.158
- type: mrr_at_100
value: 52.083
- type: mrr_at_1000
value: 52.12
- type: mrr_at_3
value: 48.733
- type: mrr_at_5
value: 50.025
- type: ndcg_at_1
value: 43.136
- type: ndcg_at_10
value: 48.685
- type: ndcg_at_100
value: 55.513
- type: ndcg_at_1000
value: 57.242000000000004
- type: ndcg_at_3
value: 43.329
- type: ndcg_at_5
value: 45.438
- type: precision_at_1
value: 43.136
- type: precision_at_10
value: 10.56
- type: precision_at_100
value: 1.6129999999999998
- type: precision_at_1000
value: 0.184
- type: precision_at_3
value: 24.064
- type: precision_at_5
value: 17.269000000000002
- type: recall_at_1
value: 29.037000000000003
- type: recall_at_10
value: 59.245000000000005
- type: recall_at_100
value: 87.355
- type: recall_at_1000
value: 98.74000000000001
- type: recall_at_3
value: 42.99
- type: recall_at_5
value: 49.681999999999995
- task:
type: PairClassification
dataset:
name: MTEB Cmnli
type: C-MTEB/CMNLI
config: default
split: validation
revision: 41bc36f332156f7adc9e38f53777c959b2ae9766
metrics:
- type: cos_sim_accuracy
value: 82.68190018039687
- type: cos_sim_ap
value: 90.18017125327886
- type: cos_sim_f1
value: 83.64080906868193
- type: cos_sim_precision
value: 79.7076890489303
- type: cos_sim_recall
value: 87.98223053542202
- type: dot_accuracy
value: 82.68190018039687
- type: dot_ap
value: 90.18782350103646
- type: dot_f1
value: 83.64242087729039
- type: dot_precision
value: 79.65313028764805
- type: dot_recall
value: 88.05237315875614
- type: euclidean_accuracy
value: 82.68190018039687
- type: euclidean_ap
value: 90.1801957900632
- type: euclidean_f1
value: 83.63636363636364
- type: euclidean_precision
value: 79.52772506852203
- type: euclidean_recall
value: 88.19265840542437
- type: manhattan_accuracy
value: 82.14070956103427
- type: manhattan_ap
value: 89.96178420101427
- type: manhattan_f1
value: 83.21087838578791
- type: manhattan_precision
value: 78.35605121850475
- type: manhattan_recall
value: 88.70703764320785
- type: max_accuracy
value: 82.68190018039687
- type: max_ap
value: 90.18782350103646
- type: max_f1
value: 83.64242087729039
- task:
type: Retrieval
dataset:
name: MTEB CovidRetrieval
type: C-MTEB/CovidRetrieval
config: default
split: dev
revision: 1271c7809071a13532e05f25fb53511ffce77117
metrics:
- type: map_at_1
value: 72.234
- type: map_at_10
value: 80.10000000000001
- type: map_at_100
value: 80.36
- type: map_at_1000
value: 80.363
- type: map_at_3
value: 78.315
- type: map_at_5
value: 79.607
- type: mrr_at_1
value: 72.392
- type: mrr_at_10
value: 80.117
- type: mrr_at_100
value: 80.36999999999999
- type: mrr_at_1000
value: 80.373
- type: mrr_at_3
value: 78.469
- type: mrr_at_5
value: 79.633
- type: ndcg_at_1
value: 72.392
- type: ndcg_at_10
value: 83.651
- type: ndcg_at_100
value: 84.749
- type: ndcg_at_1000
value: 84.83000000000001
- type: ndcg_at_3
value: 80.253
- type: ndcg_at_5
value: 82.485
- type: precision_at_1
value: 72.392
- type: precision_at_10
value: 9.557
- type: precision_at_100
value: 1.004
- type: precision_at_1000
value: 0.101
- type: precision_at_3
value: 28.732000000000003
- type: precision_at_5
value: 18.377
- type: recall_at_1
value: 72.234
- type: recall_at_10
value: 94.573
- type: recall_at_100
value: 99.368
- type: recall_at_1000
value: 100.0
- type: recall_at_3
value: 85.669
- type: recall_at_5
value: 91.01700000000001
- task:
type: Retrieval
dataset:
name: MTEB DuRetrieval
type: C-MTEB/DuRetrieval
config: default
split: dev
revision: a1a333e290fe30b10f3f56498e3a0d911a693ced
metrics:
- type: map_at_1
value: 26.173999999999996
- type: map_at_10
value: 80.04
- type: map_at_100
value: 82.94500000000001
- type: map_at_1000
value: 82.98100000000001
- type: map_at_3
value: 55.562999999999995
- type: map_at_5
value: 69.89800000000001
- type: mrr_at_1
value: 89.5
- type: mrr_at_10
value: 92.996
- type: mrr_at_100
value: 93.06400000000001
- type: mrr_at_1000
value: 93.065
- type: mrr_at_3
value: 92.658
- type: mrr_at_5
value: 92.84599999999999
- type: ndcg_at_1
value: 89.5
- type: ndcg_at_10
value: 87.443
- type: ndcg_at_100
value: 90.253
- type: ndcg_at_1000
value: 90.549
- type: ndcg_at_3
value: 85.874
- type: ndcg_at_5
value: 84.842
- type: precision_at_1
value: 89.5
- type: precision_at_10
value: 41.805
- type: precision_at_100
value: 4.827
- type: precision_at_1000
value: 0.49
- type: precision_at_3
value: 76.85
- type: precision_at_5
value: 64.8
- type: recall_at_1
value: 26.173999999999996
- type: recall_at_10
value: 89.101
- type: recall_at_100
value: 98.08099999999999
- type: recall_at_1000
value: 99.529
- type: recall_at_3
value: 57.902
- type: recall_at_5
value: 74.602
- task:
type: Retrieval
dataset:
name: MTEB EcomRetrieval
type: C-MTEB/EcomRetrieval
config: default
split: dev
revision: 687de13dc7294d6fd9be10c6945f9e8fec8166b9
metrics:
- type: map_at_1
value: 56.10000000000001
- type: map_at_10
value: 66.15299999999999
- type: map_at_100
value: 66.625
- type: map_at_1000
value: 66.636
- type: map_at_3
value: 63.632999999999996
- type: map_at_5
value: 65.293
- type: mrr_at_1
value: 56.10000000000001
- type: mrr_at_10
value: 66.15299999999999
- type: mrr_at_100
value: 66.625
- type: mrr_at_1000
value: 66.636
- type: mrr_at_3
value: 63.632999999999996
- type: mrr_at_5
value: 65.293
- type: ndcg_at_1
value: 56.10000000000001
- type: ndcg_at_10
value: 71.146
- type: ndcg_at_100
value: 73.27799999999999
- type: ndcg_at_1000
value: 73.529
- type: ndcg_at_3
value: 66.09
- type: ndcg_at_5
value: 69.08999999999999
- type: precision_at_1
value: 56.10000000000001
- type: precision_at_10
value: 8.68
- type: precision_at_100
value: 0.964
- type: precision_at_1000
value: 0.098
- type: precision_at_3
value: 24.4
- type: precision_at_5
value: 16.1
- type: recall_at_1
value: 56.10000000000001
- type: recall_at_10
value: 86.8
- type: recall_at_100
value: 96.39999999999999
- type: recall_at_1000
value: 98.3
- type: recall_at_3
value: 73.2
- type: recall_at_5
value: 80.5
- task:
type: Classification
dataset:
name: MTEB IFlyTek
type: C-MTEB/IFlyTek-classification
config: default
split: validation
revision: 421605374b29664c5fc098418fe20ada9bd55f8a
metrics:
- type: accuracy
value: 54.52096960369373
- type: f1
value: 40.930845295808695
- task:
type: Classification
dataset:
name: MTEB JDReview
type: C-MTEB/JDReview-classification
config: default
split: test
revision: b7c64bd89eb87f8ded463478346f76731f07bf8b
metrics:
- type: accuracy
value: 86.51031894934334
- type: ap
value: 55.9516014323483
- type: f1
value: 81.54813679326381
- task:
type: STS
dataset:
name: MTEB LCQMC
type: C-MTEB/LCQMC
config: default
split: test
revision: 17f9b096f80380fce5ed12a9be8be7784b337daf
metrics:
- type: cos_sim_pearson
value: 69.67437838574276
- type: cos_sim_spearman
value: 73.81314174653045
- type: euclidean_pearson
value: 72.63430276680275
- type: euclidean_spearman
value: 73.81358736777001
- type: manhattan_pearson
value: 72.58743833842829
- type: manhattan_spearman
value: 73.7590419009179
- task:
type: Reranking
dataset:
name: MTEB MMarcoReranking
type: C-MTEB/Mmarco-reranking
config: default
split: dev
revision: None
metrics:
- type: map
value: 31.648613483640254
- type: mrr
value: 30.37420634920635
- task:
type: Retrieval
dataset:
name: MTEB MMarcoRetrieval
type: C-MTEB/MMarcoRetrieval
config: default
split: dev
revision: 539bbde593d947e2a124ba72651aafc09eb33fc2
metrics:
- type: map_at_1
value: 73.28099999999999
- type: map_at_10
value: 81.977
- type: map_at_100
value: 82.222
- type: map_at_1000
value: 82.22699999999999
- type: map_at_3
value: 80.441
- type: map_at_5
value: 81.46600000000001
- type: mrr_at_1
value: 75.673
- type: mrr_at_10
value: 82.41000000000001
- type: mrr_at_100
value: 82.616
- type: mrr_at_1000
value: 82.621
- type: mrr_at_3
value: 81.094
- type: mrr_at_5
value: 81.962
- type: ndcg_at_1
value: 75.673
- type: ndcg_at_10
value: 85.15599999999999
- type: ndcg_at_100
value: 86.151
- type: ndcg_at_1000
value: 86.26899999999999
- type: ndcg_at_3
value: 82.304
- type: ndcg_at_5
value: 84.009
- type: precision_at_1
value: 75.673
- type: precision_at_10
value: 10.042
- type: precision_at_100
value: 1.052
- type: precision_at_1000
value: 0.106
- type: precision_at_3
value: 30.673000000000002
- type: precision_at_5
value: 19.326999999999998
- type: recall_at_1
value: 73.28099999999999
- type: recall_at_10
value: 94.446
- type: recall_at_100
value: 98.737
- type: recall_at_1000
value: 99.649
- type: recall_at_3
value: 86.984
- type: recall_at_5
value: 91.024
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (zh-CN)
type: mteb/amazon_massive_intent
config: zh-CN
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 81.08607935440484
- type: f1
value: 78.24879986066307
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (zh-CN)
type: mteb/amazon_massive_scenario
config: zh-CN
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 86.05917955615332
- type: f1
value: 85.05279279434997
- task:
type: Retrieval
dataset:
name: MTEB MedicalRetrieval
type: C-MTEB/MedicalRetrieval
config: default
split: dev
revision: 2039188fb5800a9803ba5048df7b76e6fb151fc6
metrics:
- type: map_at_1
value: 56.2
- type: map_at_10
value: 62.57899999999999
- type: map_at_100
value: 63.154999999999994
- type: map_at_1000
value: 63.193
- type: map_at_3
value: 61.217
- type: map_at_5
value: 62.012
- type: mrr_at_1
value: 56.3
- type: mrr_at_10
value: 62.629000000000005
- type: mrr_at_100
value: 63.205999999999996
- type: mrr_at_1000
value: 63.244
- type: mrr_at_3
value: 61.267
- type: mrr_at_5
value: 62.062
- type: ndcg_at_1
value: 56.2
- type: ndcg_at_10
value: 65.592
- type: ndcg_at_100
value: 68.657
- type: ndcg_at_1000
value: 69.671
- type: ndcg_at_3
value: 62.808
- type: ndcg_at_5
value: 64.24499999999999
- type: precision_at_1
value: 56.2
- type: precision_at_10
value: 7.5
- type: precision_at_100
value: 0.899
- type: precision_at_1000
value: 0.098
- type: precision_at_3
value: 22.467000000000002
- type: precision_at_5
value: 14.180000000000001
- type: recall_at_1
value: 56.2
- type: recall_at_10
value: 75.0
- type: recall_at_100
value: 89.9
- type: recall_at_1000
value: 97.89999999999999
- type: recall_at_3
value: 67.4
- type: recall_at_5
value: 70.89999999999999
- task:
type: Classification
dataset:
name: MTEB MultilingualSentiment
type: C-MTEB/MultilingualSentiment-classification
config: default
split: validation
revision: 46958b007a63fdbf239b7672c25d0bea67b5ea1a
metrics:
- type: accuracy
value: 76.87666666666667
- type: f1
value: 76.7317686219665
- task:
type: PairClassification
dataset:
name: MTEB Ocnli
type: C-MTEB/OCNLI
config: default
split: validation
revision: 66e76a618a34d6d565d5538088562851e6daa7ec
metrics:
- type: cos_sim_accuracy
value: 79.64266377910124
- type: cos_sim_ap
value: 84.78274442344829
- type: cos_sim_f1
value: 81.16947472745292
- type: cos_sim_precision
value: 76.47058823529412
- type: cos_sim_recall
value: 86.48363252375924
- type: dot_accuracy
value: 79.64266377910124
- type: dot_ap
value: 84.7851404063692
- type: dot_f1
value: 81.16947472745292
- type: dot_precision
value: 76.47058823529412
- type: dot_recall
value: 86.48363252375924
- type: euclidean_accuracy
value: 79.64266377910124
- type: euclidean_ap
value: 84.78068373762378
- type: euclidean_f1
value: 81.14794656110837
- type: euclidean_precision
value: 76.35009310986965
- type: euclidean_recall
value: 86.58922914466737
- type: manhattan_accuracy
value: 79.48023822414727
- type: manhattan_ap
value: 84.72928897427576
- type: manhattan_f1
value: 81.32084770823064
- type: manhattan_precision
value: 76.24768946395564
- type: manhattan_recall
value: 87.11721224920802
- type: max_accuracy
value: 79.64266377910124
- type: max_ap
value: 84.7851404063692
- type: max_f1
value: 81.32084770823064
- task:
type: Classification
dataset:
name: MTEB OnlineShopping
type: C-MTEB/OnlineShopping-classification
config: default
split: test
revision: e610f2ebd179a8fda30ae534c3878750a96db120
metrics:
- type: accuracy
value: 94.3
- type: ap
value: 92.8664032274438
- type: f1
value: 94.29311102997727
- task:
type: STS
dataset:
name: MTEB PAWSX
type: C-MTEB/PAWSX
config: default
split: test
revision: 9c6a90e430ac22b5779fb019a23e820b11a8b5e1
metrics:
- type: cos_sim_pearson
value: 48.51392279882909
- type: cos_sim_spearman
value: 54.06338895994974
- type: euclidean_pearson
value: 52.58480559573412
- type: euclidean_spearman
value: 54.06417276612201
- type: manhattan_pearson
value: 52.69525121721343
- type: manhattan_spearman
value: 54.048147455389675
- task:
type: STS
dataset:
name: MTEB QBQTC
type: C-MTEB/QBQTC
config: default
split: test
revision: 790b0510dc52b1553e8c49f3d2afb48c0e5c48b7
metrics:
- type: cos_sim_pearson
value: 29.728387290757325
- type: cos_sim_spearman
value: 31.366121633635284
- type: euclidean_pearson
value: 29.14588368552961
- type: euclidean_spearman
value: 31.36764411112844
- type: manhattan_pearson
value: 29.63517350523121
- type: manhattan_spearman
value: 31.94157020583762
- task:
type: STS
dataset:
name: MTEB STS22 (zh)
type: mteb/sts22-crosslingual-sts
config: zh
split: test
revision: eea2b4fe26a775864c896887d910b76a8098ad3f
metrics:
- type: cos_sim_pearson
value: 63.64868296271406
- type: cos_sim_spearman
value: 66.12800618164744
- type: euclidean_pearson
value: 63.21405767340238
- type: euclidean_spearman
value: 66.12786567790748
- type: manhattan_pearson
value: 64.04300276525848
- type: manhattan_spearman
value: 66.5066857145652
- task:
type: STS
dataset:
name: MTEB STSB
type: C-MTEB/STSB
config: default
split: test
revision: 0cde68302b3541bb8b3c340dc0644b0b745b3dc0
metrics:
- type: cos_sim_pearson
value: 81.2302623912794
- type: cos_sim_spearman
value: 81.16833673266562
- type: euclidean_pearson
value: 79.47647843876024
- type: euclidean_spearman
value: 81.16944349524972
- type: manhattan_pearson
value: 79.84947238492208
- type: manhattan_spearman
value: 81.64626599410026
- task:
type: Reranking
dataset:
name: MTEB T2Reranking
type: C-MTEB/T2Reranking
config: default
split: dev
revision: 76631901a18387f85eaa53e5450019b87ad58ef9
metrics:
- type: map
value: 67.80129586475687
- type: mrr
value: 77.77402311635554
- task:
type: Retrieval
dataset:
name: MTEB T2Retrieval
type: C-MTEB/T2Retrieval
config: default
split: dev
revision: 8731a845f1bf500a4f111cf1070785c793d10e64
metrics:
- type: map_at_1
value: 28.666999999999998
- type: map_at_10
value: 81.063
- type: map_at_100
value: 84.504
- type: map_at_1000
value: 84.552
- type: map_at_3
value: 56.897
- type: map_at_5
value: 70.073
- type: mrr_at_1
value: 92.087
- type: mrr_at_10
value: 94.132
- type: mrr_at_100
value: 94.19800000000001
- type: mrr_at_1000
value: 94.19999999999999
- type: mrr_at_3
value: 93.78999999999999
- type: mrr_at_5
value: 94.002
- type: ndcg_at_1
value: 92.087
- type: ndcg_at_10
value: 87.734
- type: ndcg_at_100
value: 90.736
- type: ndcg_at_1000
value: 91.184
- type: ndcg_at_3
value: 88.78
- type: ndcg_at_5
value: 87.676
- type: precision_at_1
value: 92.087
- type: precision_at_10
value: 43.46
- type: precision_at_100
value: 5.07
- type: precision_at_1000
value: 0.518
- type: precision_at_3
value: 77.49000000000001
- type: precision_at_5
value: 65.194
- type: recall_at_1
value: 28.666999999999998
- type: recall_at_10
value: 86.632
- type: recall_at_100
value: 96.646
- type: recall_at_1000
value: 98.917
- type: recall_at_3
value: 58.333999999999996
- type: recall_at_5
value: 72.974
- task:
type: Classification
dataset:
name: MTEB TNews
type: C-MTEB/TNews-classification
config: default
split: validation
revision: 317f262bf1e6126357bbe89e875451e4b0938fe4
metrics:
- type: accuracy
value: 52.971999999999994
- type: f1
value: 50.2898280984929
- task:
type: Clustering
dataset:
name: MTEB ThuNewsClusteringP2P
type: C-MTEB/ThuNewsClusteringP2P
config: default
split: test
revision: 5798586b105c0434e4f0fe5e767abe619442cf93
metrics:
- type: v_measure
value: 86.0797948663824
- task:
type: Clustering
dataset:
name: MTEB ThuNewsClusteringS2S
type: C-MTEB/ThuNewsClusteringS2S
config: default
split: test
revision: 8a8b2caeda43f39e13c4bc5bea0f8a667896e10d
metrics:
- type: v_measure
value: 85.10759092255017
- task:
type: Retrieval
dataset:
name: MTEB VideoRetrieval
type: C-MTEB/VideoRetrieval
config: default
split: dev
revision: 58c2597a5943a2ba48f4668c3b90d796283c5639
metrics:
- type: map_at_1
value: 65.60000000000001
- type: map_at_10
value: 74.773
- type: map_at_100
value: 75.128
- type: map_at_1000
value: 75.136
- type: map_at_3
value: 73.05
- type: map_at_5
value: 74.13499999999999
- type: mrr_at_1
value: 65.60000000000001
- type: mrr_at_10
value: 74.773
- type: mrr_at_100
value: 75.128
- type: mrr_at_1000
value: 75.136
- type: mrr_at_3
value: 73.05
- type: mrr_at_5
value: 74.13499999999999
- type: ndcg_at_1
value: 65.60000000000001
- type: ndcg_at_10
value: 78.84299999999999
- type: ndcg_at_100
value: 80.40899999999999
- type: ndcg_at_1000
value: 80.57
- type: ndcg_at_3
value: 75.40599999999999
- type: ndcg_at_5
value: 77.351
- type: precision_at_1
value: 65.60000000000001
- type: precision_at_10
value: 9.139999999999999
- type: precision_at_100
value: 0.984
- type: precision_at_1000
value: 0.1
- type: precision_at_3
value: 27.400000000000002
- type: precision_at_5
value: 17.380000000000003
- type: recall_at_1
value: 65.60000000000001
- type: recall_at_10
value: 91.4
- type: recall_at_100
value: 98.4
- type: recall_at_1000
value: 99.6
- type: recall_at_3
value: 82.19999999999999
- type: recall_at_5
value: 86.9
- task:
type: Classification
dataset:
name: MTEB Waimai
type: C-MTEB/waimai-classification
config: default
split: test
revision: 339287def212450dcaa9df8c22bf93e9980c7023
metrics:
- type: accuracy
value: 89.47
- type: ap
value: 75.59561751845389
- type: f1
value: 87.95207751382563
---
# sunzx0810/gte-Qwen2-7B-instruct-Q5_K_M-GGUF
This model was converted to GGUF format from [`Alibaba-NLP/gte-Qwen2-7B-instruct`](https://huggingface.co/Alibaba-NLP/gte-Qwen2-7B-instruct) using llama.cpp via ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space.
Refer to the [original model card](https://huggingface.co/Alibaba-NLP/gte-Qwen2-7B-instruct) for more details on the model.
## Use with llama.cpp
Install llama.cpp through brew (works on Mac and Linux):
```bash
brew install llama.cpp
```
Invoke the llama.cpp server or the CLI.
### CLI:
```bash
llama-cli --hf-repo sunzx0810/gte-Qwen2-7B-instruct-Q5_K_M-GGUF --hf-file gte-qwen2-7b-instruct-q5_k_m.gguf -p "The meaning to life and the universe is"
```
### Server:
```bash
llama-server --hf-repo sunzx0810/gte-Qwen2-7B-instruct-Q5_K_M-GGUF --hf-file gte-qwen2-7b-instruct-q5_k_m.gguf -c 2048
```
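Once the server is running, you can query it over HTTP. Since gte-Qwen2-7B-instruct is an embedding model, the embeddings endpoint is the natural way to exercise it. A minimal sketch, assuming the server's default host and port (`localhost:8080`) and that your llama.cpp build exposes embeddings via an `--embedding` flag and an `/embedding` route (flag and route names have varied across llama.cpp versions, so check `llama-server --help`):
```bash
# Start the server with embeddings enabled (flag name assumed; verify with --help)
llama-server --hf-repo sunzx0810/gte-Qwen2-7B-instruct-Q5_K_M-GGUF --hf-file gte-qwen2-7b-instruct-q5_k_m.gguf --embedding -c 2048

# In another shell, request an embedding vector for a piece of text
curl http://localhost:8080/embedding \
  -H "Content-Type: application/json" \
  -d '{"content": "What is the capital of China?"}'
```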
Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the llama.cpp repo.
Step 1: Clone llama.cpp from GitHub.
```bash
git clone https://github.com/ggerganov/llama.cpp
```
Step 2: Move into the llama.cpp folder and build it with the `LLAMA_CURL=1` flag, along with any other hardware-specific flags (for example, `LLAMA_CUDA=1` for NVIDIA GPUs on Linux).
```bash
cd llama.cpp && LLAMA_CURL=1 make
```
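For example, a GPU-enabled build on an NVIDIA Linux machine (assuming the CUDA toolkit is installed) simply combines the two flags mentioned above:
```bash
cd llama.cpp && LLAMA_CURL=1 LLAMA_CUDA=1 make
```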
Step 3: Run inference through the main binary.
```bash
./llama-cli --hf-repo sunzx0810/gte-Qwen2-7B-instruct-Q5_K_M-GGUF --hf-file gte-qwen2-7b-instruct-q5_k_m.gguf -p "The meaning to life and the universe is"
```
or
```bash
./llama-server --hf-repo sunzx0810/gte-Qwen2-7B-instruct-Q5_K_M-GGUF --hf-file gte-qwen2-7b-instruct-q5_k_m.gguf -c 2048
```
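Because this checkpoint is an embedding model rather than a chat model, the free-form completion prompt above is mostly a smoke test. llama.cpp also ships a dedicated embedding example that prints the embedding vector for a prompt; a minimal sketch, assuming a recent build where the binary is named `llama-embedding` (older builds call it `embedding`):
```bash
./llama-embedding --hf-repo sunzx0810/gte-Qwen2-7B-instruct-Q5_K_M-GGUF --hf-file gte-qwen2-7b-instruct-q5_k_m.gguf -p "What is the capital of China?"
```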
| {"base_model": "Alibaba-NLP/gte-Qwen2-7B-instruct", "license": "apache-2.0", "tags": ["mteb", "sentence-transformers", "transformers", "Qwen2", "sentence-similarity", "llama-cpp", "gguf-my-repo"], "model-index": [{"name": "gte-qwen2-7B-instruct", "results": [{"task": {"type": "Classification"}, "dataset": {"name": "MTEB AmazonCounterfactualClassification (en)", "type": "mteb/amazon_counterfactual", "config": "en", "split": "test", "revision": "e8379541af4e31359cca9fbcf4b00f2671dba205"}, "metrics": [{"type": "accuracy", "value": 91.31343283582089}, {"type": "ap", "value": 67.64251402604096}, {"type": "f1", "value": 87.53372530755692}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB AmazonPolarityClassification", "type": "mteb/amazon_polarity", "config": "default", "split": "test", "revision": "e2d317d38cd51312af73b3d32a06d1a08b442046"}, "metrics": [{"type": "accuracy", "value": 97.497825}, {"type": "ap", "value": 96.30329547047529}, {"type": "f1", "value": 97.49769793778039}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB AmazonReviewsClassification (en)", "type": "mteb/amazon_reviews_multi", "config": "en", "split": "test", "revision": "1399c76144fd37290681b995c656ef9b2e06e26d"}, "metrics": [{"type": "accuracy", "value": 62.564}, {"type": "f1", "value": 60.975777935041066}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB ArguAna", "type": "mteb/arguana", "config": "default", "split": "test", "revision": "c22ab2a51041ffd869aaddef7af8d8215647e41a"}, "metrics": [{"type": "map_at_1", "value": 36.486000000000004}, {"type": "map_at_10", "value": 54.842}, {"type": "map_at_100", "value": 55.206999999999994}, {"type": "map_at_1000", "value": 55.206999999999994}, {"type": "map_at_3", "value": 49.893}, {"type": "map_at_5", "value": 53.105000000000004}, {"type": "mrr_at_1", "value": 37.34}, {"type": "mrr_at_10", "value": 55.143}, {"type": "mrr_at_100", "value": 55.509}, {"type": "mrr_at_1000", "value": 55.509}, {"type": "mrr_at_3", "value": 50.212999999999994}, {"type": "mrr_at_5", "value": 53.432}, {"type": "ndcg_at_1", "value": 36.486000000000004}, {"type": "ndcg_at_10", "value": 64.273}, {"type": "ndcg_at_100", "value": 65.66199999999999}, {"type": "ndcg_at_1000", "value": 65.66199999999999}, {"type": "ndcg_at_3", "value": 54.352999999999994}, {"type": "ndcg_at_5", "value": 60.131}, {"type": "precision_at_1", "value": 36.486000000000004}, {"type": "precision_at_10", "value": 9.395000000000001}, {"type": "precision_at_100", "value": 0.996}, {"type": "precision_at_1000", "value": 0.1}, {"type": "precision_at_3", "value": 22.428}, {"type": "precision_at_5", "value": 16.259}, {"type": "recall_at_1", "value": 36.486000000000004}, {"type": "recall_at_10", "value": 93.95400000000001}, {"type": "recall_at_100", "value": 99.644}, {"type": "recall_at_1000", "value": 99.644}, {"type": "recall_at_3", "value": 67.283}, {"type": "recall_at_5", "value": 81.294}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB ArxivClusteringP2P", "type": "mteb/arxiv-clustering-p2p", "config": "default", "split": "test", "revision": "a122ad7f3f0291bf49cc6f4d32aa80929df69d5d"}, "metrics": [{"type": "v_measure", "value": 56.461169803700564}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB ArxivClusteringS2S", "type": "mteb/arxiv-clustering-s2s", "config": "default", "split": "test", "revision": "f910caf1a6075f7329cdf8c1a6135696f37dbd53"}, "metrics": [{"type": "v_measure", "value": 51.73600434466286}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB 
AskUbuntuDupQuestions", "type": "mteb/askubuntudupquestions-reranking", "config": "default", "split": "test", "revision": "2000358ca161889fa9c082cb41daa8dcfb161a54"}, "metrics": [{"type": "map", "value": 67.57827065898053}, {"type": "mrr", "value": 79.08136569493911}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB BIOSSES", "type": "mteb/biosses-sts", "config": "default", "split": "test", "revision": "d3fb88f8f02e40887cd149695127462bbcf29b4a"}, "metrics": [{"type": "cos_sim_pearson", "value": 83.53324575999243}, {"type": "cos_sim_spearman", "value": 81.37173362822374}, {"type": "euclidean_pearson", "value": 82.19243335103444}, {"type": "euclidean_spearman", "value": 81.33679307304334}, {"type": "manhattan_pearson", "value": 82.38752665975699}, {"type": "manhattan_spearman", "value": 81.31510583189689}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB Banking77Classification", "type": "mteb/banking77", "config": "default", "split": "test", "revision": "0fd18e25b25c072e09e0d92ab615fda904d66300"}, "metrics": [{"type": "accuracy", "value": 87.56818181818181}, {"type": "f1", "value": 87.25826722019875}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB BiorxivClusteringP2P", "type": "mteb/biorxiv-clustering-p2p", "config": "default", "split": "test", "revision": "65b79d1d13f80053f67aca9498d9402c2d9f1f40"}, "metrics": [{"type": "v_measure", "value": 50.09239610327673}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB BiorxivClusteringS2S", "type": "mteb/biorxiv-clustering-s2s", "config": "default", "split": "test", "revision": "258694dd0231531bc1fd9de6ceb52a0853c6d908"}, "metrics": [{"type": "v_measure", "value": 46.64733054606282}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackAndroidRetrieval", "type": "BeIR/cqadupstack", "config": "default", "split": "test", "revision": "f46a197baaae43b4f621051089b82a364682dfeb"}, "metrics": [{"type": "map_at_1", "value": 33.997}, {"type": "map_at_10", "value": 48.176}, {"type": "map_at_100", "value": 49.82}, {"type": "map_at_1000", "value": 49.924}, {"type": "map_at_3", "value": 43.626}, {"type": "map_at_5", "value": 46.275}, {"type": "mrr_at_1", "value": 42.059999999999995}, {"type": "mrr_at_10", "value": 53.726}, {"type": "mrr_at_100", "value": 54.398}, {"type": "mrr_at_1000", "value": 54.416}, {"type": "mrr_at_3", "value": 50.714999999999996}, {"type": "mrr_at_5", "value": 52.639}, {"type": "ndcg_at_1", "value": 42.059999999999995}, {"type": "ndcg_at_10", "value": 55.574999999999996}, {"type": "ndcg_at_100", "value": 60.744}, {"type": "ndcg_at_1000", "value": 61.85699999999999}, {"type": "ndcg_at_3", "value": 49.363}, {"type": "ndcg_at_5", "value": 52.44}, {"type": "precision_at_1", "value": 42.059999999999995}, {"type": "precision_at_10", "value": 11.101999999999999}, {"type": "precision_at_100", "value": 1.73}, {"type": "precision_at_1000", "value": 0.218}, {"type": "precision_at_3", "value": 24.464}, {"type": "precision_at_5", "value": 18.026}, {"type": "recall_at_1", "value": 33.997}, {"type": "recall_at_10", "value": 70.35900000000001}, {"type": "recall_at_100", "value": 91.642}, {"type": "recall_at_1000", "value": 97.977}, {"type": "recall_at_3", "value": 52.76}, {"type": "recall_at_5", "value": 61.148}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackEnglishRetrieval", "type": "BeIR/cqadupstack", "config": "default", "split": "test", "revision": "ad9991cb51e31e31e430383c75ffb2885547b5f0"}, "metrics": [{"type": "map_at_1", "value": 35.884}, {"type": 
"map_at_10", "value": 48.14}, {"type": "map_at_100", "value": 49.5}, {"type": "map_at_1000", "value": 49.63}, {"type": "map_at_3", "value": 44.646}, {"type": "map_at_5", "value": 46.617999999999995}, {"type": "mrr_at_1", "value": 44.458999999999996}, {"type": "mrr_at_10", "value": 53.751000000000005}, {"type": "mrr_at_100", "value": 54.37800000000001}, {"type": "mrr_at_1000", "value": 54.415}, {"type": "mrr_at_3", "value": 51.815}, {"type": "mrr_at_5", "value": 52.882}, {"type": "ndcg_at_1", "value": 44.458999999999996}, {"type": "ndcg_at_10", "value": 54.157}, {"type": "ndcg_at_100", "value": 58.362}, {"type": "ndcg_at_1000", "value": 60.178}, {"type": "ndcg_at_3", "value": 49.661}, {"type": "ndcg_at_5", "value": 51.74999999999999}, {"type": "precision_at_1", "value": 44.458999999999996}, {"type": "precision_at_10", "value": 10.248}, {"type": "precision_at_100", "value": 1.5890000000000002}, {"type": "precision_at_1000", "value": 0.207}, {"type": "precision_at_3", "value": 23.928}, {"type": "precision_at_5", "value": 16.878999999999998}, {"type": "recall_at_1", "value": 35.884}, {"type": "recall_at_10", "value": 64.798}, {"type": "recall_at_100", "value": 82.345}, {"type": "recall_at_1000", "value": 93.267}, {"type": "recall_at_3", "value": 51.847}, {"type": "recall_at_5", "value": 57.601}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackGamingRetrieval", "type": "BeIR/cqadupstack", "config": "default", "split": "test", "revision": "4885aa143210c98657558c04aaf3dc47cfb54340"}, "metrics": [{"type": "map_at_1", "value": 39.383}, {"type": "map_at_10", "value": 53.714}, {"type": "map_at_100", "value": 54.838}, {"type": "map_at_1000", "value": 54.87800000000001}, {"type": "map_at_3", "value": 50.114999999999995}, {"type": "map_at_5", "value": 52.153000000000006}, {"type": "mrr_at_1", "value": 45.016}, {"type": "mrr_at_10", "value": 56.732000000000006}, {"type": "mrr_at_100", "value": 57.411}, {"type": "mrr_at_1000", "value": 57.431}, {"type": "mrr_at_3", "value": 54.044000000000004}, {"type": "mrr_at_5", "value": 55.639}, {"type": "ndcg_at_1", "value": 45.016}, {"type": "ndcg_at_10", "value": 60.228}, {"type": "ndcg_at_100", "value": 64.277}, {"type": "ndcg_at_1000", "value": 65.07}, {"type": "ndcg_at_3", "value": 54.124}, {"type": "ndcg_at_5", "value": 57.147000000000006}, {"type": "precision_at_1", "value": 45.016}, {"type": "precision_at_10", "value": 9.937}, {"type": "precision_at_100", "value": 1.288}, {"type": "precision_at_1000", "value": 0.13899999999999998}, {"type": "precision_at_3", "value": 24.471999999999998}, {"type": "precision_at_5", "value": 16.991}, {"type": "recall_at_1", "value": 39.383}, {"type": "recall_at_10", "value": 76.175}, {"type": "recall_at_100", "value": 93.02}, {"type": "recall_at_1000", "value": 98.60900000000001}, {"type": "recall_at_3", "value": 60.265}, {"type": "recall_at_5", "value": 67.46600000000001}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackGisRetrieval", "type": "BeIR/cqadupstack", "config": "default", "split": "test", "revision": "5003b3064772da1887988e05400cf3806fe491f2"}, "metrics": [{"type": "map_at_1", "value": 27.426000000000002}, {"type": "map_at_10", "value": 37.397000000000006}, {"type": "map_at_100", "value": 38.61}, {"type": "map_at_1000", "value": 38.678000000000004}, {"type": "map_at_3", "value": 34.150999999999996}, {"type": "map_at_5", "value": 36.137}, {"type": "mrr_at_1", "value": 29.944}, {"type": "mrr_at_10", "value": 39.654}, {"type": "mrr_at_100", "value": 40.638000000000005}, 
{"type": "mrr_at_1000", "value": 40.691}, {"type": "mrr_at_3", "value": 36.817}, {"type": "mrr_at_5", "value": 38.524}, {"type": "ndcg_at_1", "value": 29.944}, {"type": "ndcg_at_10", "value": 43.094}, {"type": "ndcg_at_100", "value": 48.789}, {"type": "ndcg_at_1000", "value": 50.339999999999996}, {"type": "ndcg_at_3", "value": 36.984}, {"type": "ndcg_at_5", "value": 40.248}, {"type": "precision_at_1", "value": 29.944}, {"type": "precision_at_10", "value": 6.78}, {"type": "precision_at_100", "value": 1.024}, {"type": "precision_at_1000", "value": 0.11800000000000001}, {"type": "precision_at_3", "value": 15.895000000000001}, {"type": "precision_at_5", "value": 11.39}, {"type": "recall_at_1", "value": 27.426000000000002}, {"type": "recall_at_10", "value": 58.464000000000006}, {"type": "recall_at_100", "value": 84.193}, {"type": "recall_at_1000", "value": 95.52000000000001}, {"type": "recall_at_3", "value": 42.172}, {"type": "recall_at_5", "value": 50.101}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackMathematicaRetrieval", "type": "BeIR/cqadupstack", "config": "default", "split": "test", "revision": "90fceea13679c63fe563ded68f3b6f06e50061de"}, "metrics": [{"type": "map_at_1", "value": 19.721}, {"type": "map_at_10", "value": 31.604}, {"type": "map_at_100", "value": 32.972}, {"type": "map_at_1000", "value": 33.077}, {"type": "map_at_3", "value": 27.218999999999998}, {"type": "map_at_5", "value": 29.53}, {"type": "mrr_at_1", "value": 25.0}, {"type": "mrr_at_10", "value": 35.843}, {"type": "mrr_at_100", "value": 36.785000000000004}, {"type": "mrr_at_1000", "value": 36.842000000000006}, {"type": "mrr_at_3", "value": 32.193}, {"type": "mrr_at_5", "value": 34.264}, {"type": "ndcg_at_1", "value": 25.0}, {"type": "ndcg_at_10", "value": 38.606}, {"type": "ndcg_at_100", "value": 44.272}, {"type": "ndcg_at_1000", "value": 46.527}, {"type": "ndcg_at_3", "value": 30.985000000000003}, {"type": "ndcg_at_5", "value": 34.43}, {"type": "precision_at_1", "value": 25.0}, {"type": "precision_at_10", "value": 7.811}, {"type": "precision_at_100", "value": 1.203}, {"type": "precision_at_1000", "value": 0.15}, {"type": "precision_at_3", "value": 15.423}, {"type": "precision_at_5", "value": 11.791}, {"type": "recall_at_1", "value": 19.721}, {"type": "recall_at_10", "value": 55.625}, {"type": "recall_at_100", "value": 79.34400000000001}, {"type": "recall_at_1000", "value": 95.208}, {"type": "recall_at_3", "value": 35.19}, {"type": "recall_at_5", "value": 43.626}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackPhysicsRetrieval", "type": "BeIR/cqadupstack", "config": "default", "split": "test", "revision": "79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4"}, "metrics": [{"type": "map_at_1", "value": 33.784}, {"type": "map_at_10", "value": 47.522}, {"type": "map_at_100", "value": 48.949999999999996}, {"type": "map_at_1000", "value": 49.038}, {"type": "map_at_3", "value": 43.284}, {"type": "map_at_5", "value": 45.629}, {"type": "mrr_at_1", "value": 41.482}, {"type": "mrr_at_10", "value": 52.830999999999996}, {"type": "mrr_at_100", "value": 53.559999999999995}, {"type": "mrr_at_1000", "value": 53.588}, {"type": "mrr_at_3", "value": 50.016000000000005}, {"type": "mrr_at_5", "value": 51.614000000000004}, {"type": "ndcg_at_1", "value": 41.482}, {"type": "ndcg_at_10", "value": 54.569}, {"type": "ndcg_at_100", "value": 59.675999999999995}, {"type": "ndcg_at_1000", "value": 60.989000000000004}, {"type": "ndcg_at_3", "value": 48.187000000000005}, {"type": "ndcg_at_5", "value": 51.183}, 
{"type": "precision_at_1", "value": 41.482}, {"type": "precision_at_10", "value": 10.221}, {"type": "precision_at_100", "value": 1.486}, {"type": "precision_at_1000", "value": 0.17500000000000002}, {"type": "precision_at_3", "value": 23.548}, {"type": "precision_at_5", "value": 16.805}, {"type": "recall_at_1", "value": 33.784}, {"type": "recall_at_10", "value": 69.798}, {"type": "recall_at_100", "value": 90.098}, {"type": "recall_at_1000", "value": 98.176}, {"type": "recall_at_3", "value": 52.127}, {"type": "recall_at_5", "value": 59.861}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackProgrammersRetrieval", "type": "BeIR/cqadupstack", "config": "default", "split": "test", "revision": "6184bc1440d2dbc7612be22b50686b8826d22b32"}, "metrics": [{"type": "map_at_1", "value": 28.038999999999998}, {"type": "map_at_10", "value": 41.904}, {"type": "map_at_100", "value": 43.36}, {"type": "map_at_1000", "value": 43.453}, {"type": "map_at_3", "value": 37.785999999999994}, {"type": "map_at_5", "value": 40.105000000000004}, {"type": "mrr_at_1", "value": 35.046}, {"type": "mrr_at_10", "value": 46.926}, {"type": "mrr_at_100", "value": 47.815000000000005}, {"type": "mrr_at_1000", "value": 47.849000000000004}, {"type": "mrr_at_3", "value": 44.273}, {"type": "mrr_at_5", "value": 45.774}, {"type": "ndcg_at_1", "value": 35.046}, {"type": "ndcg_at_10", "value": 48.937000000000005}, {"type": "ndcg_at_100", "value": 54.544000000000004}, {"type": "ndcg_at_1000", "value": 56.069}, {"type": "ndcg_at_3", "value": 42.858000000000004}, {"type": "ndcg_at_5", "value": 45.644}, {"type": "precision_at_1", "value": 35.046}, {"type": "precision_at_10", "value": 9.452}, {"type": "precision_at_100", "value": 1.429}, {"type": "precision_at_1000", "value": 0.173}, {"type": "precision_at_3", "value": 21.346999999999998}, {"type": "precision_at_5", "value": 15.342}, {"type": "recall_at_1", "value": 28.038999999999998}, {"type": "recall_at_10", "value": 64.59700000000001}, {"type": "recall_at_100", "value": 87.735}, {"type": "recall_at_1000", "value": 97.41300000000001}, {"type": "recall_at_3", "value": 47.368}, {"type": "recall_at_5", "value": 54.93900000000001}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackRetrieval", "type": "BeIR/cqadupstack", "config": "default", "split": "test", "revision": "4ffe81d471b1924886b33c7567bfb200e9eec5c4"}, "metrics": [{"type": "map_at_1", "value": 28.17291666666667}, {"type": "map_at_10", "value": 40.025749999999995}, {"type": "map_at_100", "value": 41.39208333333333}, {"type": "map_at_1000", "value": 41.499249999999996}, {"type": "map_at_3", "value": 36.347}, {"type": "map_at_5", "value": 38.41391666666667}, {"type": "mrr_at_1", "value": 33.65925}, {"type": "mrr_at_10", "value": 44.085499999999996}, {"type": "mrr_at_100", "value": 44.94116666666667}, {"type": "mrr_at_1000", "value": 44.9855}, {"type": "mrr_at_3", "value": 41.2815}, {"type": "mrr_at_5", "value": 42.91491666666666}, {"type": "ndcg_at_1", "value": 33.65925}, {"type": "ndcg_at_10", "value": 46.430833333333325}, {"type": "ndcg_at_100", "value": 51.761}, {"type": "ndcg_at_1000", "value": 53.50899999999999}, {"type": "ndcg_at_3", "value": 40.45133333333333}, {"type": "ndcg_at_5", "value": 43.31483333333334}, {"type": "precision_at_1", "value": 33.65925}, {"type": "precision_at_10", "value": 8.4995}, {"type": "precision_at_100", "value": 1.3210000000000004}, {"type": "precision_at_1000", "value": 0.16591666666666666}, {"type": "precision_at_3", "value": 19.165083333333335}, {"type": 
"precision_at_5", "value": 13.81816666666667}, {"type": "recall_at_1", "value": 28.17291666666667}, {"type": "recall_at_10", "value": 61.12624999999999}, {"type": "recall_at_100", "value": 83.97266666666667}, {"type": "recall_at_1000", "value": 95.66550000000001}, {"type": "recall_at_3", "value": 44.661249999999995}, {"type": "recall_at_5", "value": 51.983333333333334}, {"type": "map_at_1", "value": 17.936}, {"type": "map_at_10", "value": 27.399}, {"type": "map_at_100", "value": 28.632}, {"type": "map_at_1000", "value": 28.738000000000003}, {"type": "map_at_3", "value": 24.456}, {"type": "map_at_5", "value": 26.06}, {"type": "mrr_at_1", "value": 19.224}, {"type": "mrr_at_10", "value": 28.998}, {"type": "mrr_at_100", "value": 30.11}, {"type": "mrr_at_1000", "value": 30.177}, {"type": "mrr_at_3", "value": 26.247999999999998}, {"type": "mrr_at_5", "value": 27.708}, {"type": "ndcg_at_1", "value": 19.224}, {"type": "ndcg_at_10", "value": 32.911}, {"type": "ndcg_at_100", "value": 38.873999999999995}, {"type": "ndcg_at_1000", "value": 41.277}, {"type": "ndcg_at_3", "value": 27.142}, {"type": "ndcg_at_5", "value": 29.755}, {"type": "precision_at_1", "value": 19.224}, {"type": "precision_at_10", "value": 5.6930000000000005}, {"type": "precision_at_100", "value": 0.9259999999999999}, {"type": "precision_at_1000", "value": 0.126}, {"type": "precision_at_3", "value": 12.138}, {"type": "precision_at_5", "value": 8.909}, {"type": "recall_at_1", "value": 17.936}, {"type": "recall_at_10", "value": 48.096}, {"type": "recall_at_100", "value": 75.389}, {"type": "recall_at_1000", "value": 92.803}, {"type": "recall_at_3", "value": 32.812999999999995}, {"type": "recall_at_5", "value": 38.851}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackStatsRetrieval", "type": "BeIR/cqadupstack", "config": "default", "split": "test", "revision": "65ac3a16b8e91f9cee4c9828cc7c335575432a2a"}, "metrics": [{"type": "map_at_1", "value": 24.681}, {"type": "map_at_10", "value": 34.892}, {"type": "map_at_100", "value": 35.996}, {"type": "map_at_1000", "value": 36.083}, {"type": "map_at_3", "value": 31.491999999999997}, {"type": "map_at_5", "value": 33.632}, {"type": "mrr_at_1", "value": 28.528}, {"type": "mrr_at_10", "value": 37.694}, {"type": "mrr_at_100", "value": 38.613}, {"type": "mrr_at_1000", "value": 38.668}, {"type": "mrr_at_3", "value": 34.714}, {"type": "mrr_at_5", "value": 36.616}, {"type": "ndcg_at_1", "value": 28.528}, {"type": "ndcg_at_10", "value": 40.703}, {"type": "ndcg_at_100", "value": 45.993}, {"type": "ndcg_at_1000", "value": 47.847}, {"type": "ndcg_at_3", "value": 34.622}, {"type": "ndcg_at_5", "value": 38.035999999999994}, {"type": "precision_at_1", "value": 28.528}, {"type": "precision_at_10", "value": 6.902}, {"type": "precision_at_100", "value": 1.0370000000000001}, {"type": "precision_at_1000", "value": 0.126}, {"type": "precision_at_3", "value": 15.798000000000002}, {"type": "precision_at_5", "value": 11.655999999999999}, {"type": "recall_at_1", "value": 24.681}, {"type": "recall_at_10", "value": 55.81}, {"type": "recall_at_100", "value": 79.785}, {"type": "recall_at_1000", "value": 92.959}, {"type": "recall_at_3", "value": 39.074}, {"type": "recall_at_5", "value": 47.568}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackTexRetrieval", "type": "BeIR/cqadupstack", "config": "default", "split": "test", "revision": "46989137a86843e03a6195de44b09deda022eec7"}, "metrics": [{"type": "map_at_1", "value": 18.627}, {"type": "map_at_10", "value": 27.872000000000003}, 
{"type": "map_at_100", "value": 29.237999999999996}, {"type": "map_at_1000", "value": 29.363}, {"type": "map_at_3", "value": 24.751}, {"type": "map_at_5", "value": 26.521}, {"type": "mrr_at_1", "value": 23.021}, {"type": "mrr_at_10", "value": 31.924000000000003}, {"type": "mrr_at_100", "value": 32.922000000000004}, {"type": "mrr_at_1000", "value": 32.988}, {"type": "mrr_at_3", "value": 29.192}, {"type": "mrr_at_5", "value": 30.798}, {"type": "ndcg_at_1", "value": 23.021}, {"type": "ndcg_at_10", "value": 33.535}, {"type": "ndcg_at_100", "value": 39.732}, {"type": "ndcg_at_1000", "value": 42.201}, {"type": "ndcg_at_3", "value": 28.153}, {"type": "ndcg_at_5", "value": 30.746000000000002}, {"type": "precision_at_1", "value": 23.021}, {"type": "precision_at_10", "value": 6.459}, {"type": "precision_at_100", "value": 1.1320000000000001}, {"type": "precision_at_1000", "value": 0.153}, {"type": "precision_at_3", "value": 13.719000000000001}, {"type": "precision_at_5", "value": 10.193000000000001}, {"type": "recall_at_1", "value": 18.627}, {"type": "recall_at_10", "value": 46.463}, {"type": "recall_at_100", "value": 74.226}, {"type": "recall_at_1000", "value": 91.28500000000001}, {"type": "recall_at_3", "value": 31.357000000000003}, {"type": "recall_at_5", "value": 38.067}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackUnixRetrieval", "type": "BeIR/cqadupstack", "config": "default", "split": "test", "revision": "6c6430d3a6d36f8d2a829195bc5dc94d7e063e53"}, "metrics": [{"type": "map_at_1", "value": 31.457}, {"type": "map_at_10", "value": 42.888}, {"type": "map_at_100", "value": 44.24}, {"type": "map_at_1000", "value": 44.327}, {"type": "map_at_3", "value": 39.588}, {"type": "map_at_5", "value": 41.423}, {"type": "mrr_at_1", "value": 37.126999999999995}, {"type": "mrr_at_10", "value": 47.083000000000006}, {"type": "mrr_at_100", "value": 47.997}, {"type": "mrr_at_1000", "value": 48.044}, {"type": "mrr_at_3", "value": 44.574000000000005}, {"type": "mrr_at_5", "value": 46.202}, {"type": "ndcg_at_1", "value": 37.126999999999995}, {"type": "ndcg_at_10", "value": 48.833}, {"type": "ndcg_at_100", "value": 54.327000000000005}, {"type": "ndcg_at_1000", "value": 56.011}, {"type": "ndcg_at_3", "value": 43.541999999999994}, {"type": "ndcg_at_5", "value": 46.127}, {"type": "precision_at_1", "value": 37.126999999999995}, {"type": "precision_at_10", "value": 8.376999999999999}, {"type": "precision_at_100", "value": 1.2309999999999999}, {"type": "precision_at_1000", "value": 0.146}, {"type": "precision_at_3", "value": 20.211000000000002}, {"type": "precision_at_5", "value": 14.16}, {"type": "recall_at_1", "value": 31.457}, {"type": "recall_at_10", "value": 62.369}, {"type": "recall_at_100", "value": 85.444}, {"type": "recall_at_1000", "value": 96.65599999999999}, {"type": "recall_at_3", "value": 47.961}, {"type": "recall_at_5", "value": 54.676}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackWebmastersRetrieval", "type": "BeIR/cqadupstack", "config": "default", "split": "test", "revision": "160c094312a0e1facb97e55eeddb698c0abe3571"}, "metrics": [{"type": "map_at_1", "value": 27.139999999999997}, {"type": "map_at_10", "value": 38.801}, {"type": "map_at_100", "value": 40.549}, {"type": "map_at_1000", "value": 40.802}, {"type": "map_at_3", "value": 35.05}, {"type": "map_at_5", "value": 36.884}, {"type": "mrr_at_1", "value": 33.004}, {"type": "mrr_at_10", "value": 43.864}, {"type": "mrr_at_100", "value": 44.667}, {"type": "mrr_at_1000", "value": 44.717}, {"type": 
"mrr_at_3", "value": 40.777}, {"type": "mrr_at_5", "value": 42.319}, {"type": "ndcg_at_1", "value": 33.004}, {"type": "ndcg_at_10", "value": 46.022}, {"type": "ndcg_at_100", "value": 51.542}, {"type": "ndcg_at_1000", "value": 53.742000000000004}, {"type": "ndcg_at_3", "value": 39.795}, {"type": "ndcg_at_5", "value": 42.272}, {"type": "precision_at_1", "value": 33.004}, {"type": "precision_at_10", "value": 9.012}, {"type": "precision_at_100", "value": 1.7770000000000001}, {"type": "precision_at_1000", "value": 0.26}, {"type": "precision_at_3", "value": 19.038}, {"type": "precision_at_5", "value": 13.675999999999998}, {"type": "recall_at_1", "value": 27.139999999999997}, {"type": "recall_at_10", "value": 60.961}, {"type": "recall_at_100", "value": 84.451}, {"type": "recall_at_1000", "value": 98.113}, {"type": "recall_at_3", "value": 43.001}, {"type": "recall_at_5", "value": 49.896}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB ClimateFEVER", "type": "mteb/climate-fever", "config": "default", "split": "test", "revision": "47f2ac6acb640fc46020b02a5b59fdda04d39380"}, "metrics": [{"type": "map_at_1", "value": 22.076999999999998}, {"type": "map_at_10", "value": 35.44}, {"type": "map_at_100", "value": 37.651}, {"type": "map_at_1000", "value": 37.824999999999996}, {"type": "map_at_3", "value": 30.764999999999997}, {"type": "map_at_5", "value": 33.26}, {"type": "mrr_at_1", "value": 50.163000000000004}, {"type": "mrr_at_10", "value": 61.207}, {"type": "mrr_at_100", "value": 61.675000000000004}, {"type": "mrr_at_1000", "value": 61.692}, {"type": "mrr_at_3", "value": 58.60999999999999}, {"type": "mrr_at_5", "value": 60.307}, {"type": "ndcg_at_1", "value": 50.163000000000004}, {"type": "ndcg_at_10", "value": 45.882}, {"type": "ndcg_at_100", "value": 53.239999999999995}, {"type": "ndcg_at_1000", "value": 55.852000000000004}, {"type": "ndcg_at_3", "value": 40.514}, {"type": "ndcg_at_5", "value": 42.038}, {"type": "precision_at_1", "value": 50.163000000000004}, {"type": "precision_at_10", "value": 13.466000000000001}, {"type": "precision_at_100", "value": 2.164}, {"type": "precision_at_1000", "value": 0.266}, {"type": "precision_at_3", "value": 29.707}, {"type": "precision_at_5", "value": 21.694}, {"type": "recall_at_1", "value": 22.076999999999998}, {"type": "recall_at_10", "value": 50.193}, {"type": "recall_at_100", "value": 74.993}, {"type": "recall_at_1000", "value": 89.131}, {"type": "recall_at_3", "value": 35.472}, {"type": "recall_at_5", "value": 41.814}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB DBPedia", "type": "mteb/dbpedia", "config": "default", "split": "test", "revision": "c0f706b76e590d620bd6618b3ca8efdd34e2d659"}, "metrics": [{"type": "map_at_1", "value": 9.953}, {"type": "map_at_10", "value": 24.515}, {"type": "map_at_100", "value": 36.173}, {"type": "map_at_1000", "value": 38.351}, {"type": "map_at_3", "value": 16.592000000000002}, {"type": "map_at_5", "value": 20.036}, {"type": "mrr_at_1", "value": 74.25}, {"type": "mrr_at_10", "value": 81.813}, {"type": "mrr_at_100", "value": 82.006}, {"type": "mrr_at_1000", "value": 82.011}, {"type": "mrr_at_3", "value": 80.875}, {"type": "mrr_at_5", "value": 81.362}, {"type": "ndcg_at_1", "value": 62.5}, {"type": "ndcg_at_10", "value": 52.42}, {"type": "ndcg_at_100", "value": 56.808}, {"type": "ndcg_at_1000", "value": 63.532999999999994}, {"type": "ndcg_at_3", "value": 56.654}, {"type": "ndcg_at_5", "value": 54.18300000000001}, {"type": "precision_at_1", "value": 74.25}, {"type": "precision_at_10", "value": 
42.699999999999996}, {"type": "precision_at_100", "value": 13.675}, {"type": "precision_at_1000", "value": 2.664}, {"type": "precision_at_3", "value": 60.5}, {"type": "precision_at_5", "value": 52.800000000000004}, {"type": "recall_at_1", "value": 9.953}, {"type": "recall_at_10", "value": 30.253999999999998}, {"type": "recall_at_100", "value": 62.516000000000005}, {"type": "recall_at_1000", "value": 84.163}, {"type": "recall_at_3", "value": 18.13}, {"type": "recall_at_5", "value": 22.771}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB EmotionClassification", "type": "mteb/emotion", "config": "default", "split": "test", "revision": "4f58c6b202a23cf9a4da393831edf4f9183cad37"}, "metrics": [{"type": "accuracy", "value": 79.455}, {"type": "f1", "value": 74.16798697647569}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB FEVER", "type": "mteb/fever", "config": "default", "split": "test", "revision": "bea83ef9e8fb933d90a2f1d5515737465d613e12"}, "metrics": [{"type": "map_at_1", "value": 87.531}, {"type": "map_at_10", "value": 93.16799999999999}, {"type": "map_at_100", "value": 93.341}, {"type": "map_at_1000", "value": 93.349}, {"type": "map_at_3", "value": 92.444}, {"type": "map_at_5", "value": 92.865}, {"type": "mrr_at_1", "value": 94.014}, {"type": "mrr_at_10", "value": 96.761}, {"type": "mrr_at_100", "value": 96.762}, {"type": "mrr_at_1000", "value": 96.762}, {"type": "mrr_at_3", "value": 96.672}, {"type": "mrr_at_5", "value": 96.736}, {"type": "ndcg_at_1", "value": 94.014}, {"type": "ndcg_at_10", "value": 95.112}, {"type": "ndcg_at_100", "value": 95.578}, {"type": "ndcg_at_1000", "value": 95.68900000000001}, {"type": "ndcg_at_3", "value": 94.392}, {"type": "ndcg_at_5", "value": 94.72500000000001}, {"type": "precision_at_1", "value": 94.014}, {"type": "precision_at_10", "value": 11.065}, {"type": "precision_at_100", "value": 1.157}, {"type": "precision_at_1000", "value": 0.11800000000000001}, {"type": "precision_at_3", "value": 35.259}, {"type": "precision_at_5", "value": 21.599}, {"type": "recall_at_1", "value": 87.531}, {"type": "recall_at_10", "value": 97.356}, {"type": "recall_at_100", "value": 98.965}, {"type": "recall_at_1000", "value": 99.607}, {"type": "recall_at_3", "value": 95.312}, {"type": "recall_at_5", "value": 96.295}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB FiQA2018", "type": "mteb/fiqa", "config": "default", "split": "test", "revision": "27a168819829fe9bcd655c2df245fb19452e8e06"}, "metrics": [{"type": "map_at_1", "value": 32.055}, {"type": "map_at_10", "value": 53.114}, {"type": "map_at_100", "value": 55.235}, {"type": "map_at_1000", "value": 55.345}, {"type": "map_at_3", "value": 45.854}, {"type": "map_at_5", "value": 50.025}, {"type": "mrr_at_1", "value": 60.34}, {"type": "mrr_at_10", "value": 68.804}, {"type": "mrr_at_100", "value": 69.309}, {"type": "mrr_at_1000", "value": 69.32199999999999}, {"type": "mrr_at_3", "value": 66.40899999999999}, {"type": "mrr_at_5", "value": 67.976}, {"type": "ndcg_at_1", "value": 60.34}, {"type": "ndcg_at_10", "value": 62.031000000000006}, {"type": "ndcg_at_100", "value": 68.00500000000001}, {"type": "ndcg_at_1000", "value": 69.286}, {"type": "ndcg_at_3", "value": 56.355999999999995}, {"type": "ndcg_at_5", "value": 58.687}, {"type": "precision_at_1", "value": 60.34}, {"type": "precision_at_10", "value": 17.176}, {"type": "precision_at_100", "value": 2.36}, {"type": "precision_at_1000", "value": 0.259}, {"type": "precision_at_3", "value": 37.14}, {"type": "precision_at_5", "value": 27.809}, 
{"type": "recall_at_1", "value": 32.055}, {"type": "recall_at_10", "value": 70.91}, {"type": "recall_at_100", "value": 91.83}, {"type": "recall_at_1000", "value": 98.871}, {"type": "recall_at_3", "value": 51.202999999999996}, {"type": "recall_at_5", "value": 60.563}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB HotpotQA", "type": "mteb/hotpotqa", "config": "default", "split": "test", "revision": "ab518f4d6fcca38d87c25209f94beba119d02014"}, "metrics": [{"type": "map_at_1", "value": 43.68}, {"type": "map_at_10", "value": 64.389}, {"type": "map_at_100", "value": 65.24}, {"type": "map_at_1000", "value": 65.303}, {"type": "map_at_3", "value": 61.309000000000005}, {"type": "map_at_5", "value": 63.275999999999996}, {"type": "mrr_at_1", "value": 87.36}, {"type": "mrr_at_10", "value": 91.12}, {"type": "mrr_at_100", "value": 91.227}, {"type": "mrr_at_1000", "value": 91.229}, {"type": "mrr_at_3", "value": 90.57600000000001}, {"type": "mrr_at_5", "value": 90.912}, {"type": "ndcg_at_1", "value": 87.36}, {"type": "ndcg_at_10", "value": 73.076}, {"type": "ndcg_at_100", "value": 75.895}, {"type": "ndcg_at_1000", "value": 77.049}, {"type": "ndcg_at_3", "value": 68.929}, {"type": "ndcg_at_5", "value": 71.28}, {"type": "precision_at_1", "value": 87.36}, {"type": "precision_at_10", "value": 14.741000000000001}, {"type": "precision_at_100", "value": 1.694}, {"type": "precision_at_1000", "value": 0.185}, {"type": "precision_at_3", "value": 43.043}, {"type": "precision_at_5", "value": 27.681}, {"type": "recall_at_1", "value": 43.68}, {"type": "recall_at_10", "value": 73.707}, {"type": "recall_at_100", "value": 84.7}, {"type": "recall_at_1000", "value": 92.309}, {"type": "recall_at_3", "value": 64.564}, {"type": "recall_at_5", "value": 69.203}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB ImdbClassification", "type": "mteb/imdb", "config": "default", "split": "test", "revision": "3d86128a09e091d6018b6d26cad27f2739fc2db7"}, "metrics": [{"type": "accuracy", "value": 96.75399999999999}, {"type": "ap", "value": 95.29389839242187}, {"type": "f1", "value": 96.75348377433475}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB MSMARCO", "type": "mteb/msmarco", "config": "default", "split": "dev", "revision": "c5a29a104738b98a9e76336939199e264163d4a0"}, "metrics": [{"type": "map_at_1", "value": 25.176}, {"type": "map_at_10", "value": 38.598}, {"type": "map_at_100", "value": 39.707}, {"type": "map_at_1000", "value": 39.744}, {"type": "map_at_3", "value": 34.566}, {"type": "map_at_5", "value": 36.863}, {"type": "mrr_at_1", "value": 25.874000000000002}, {"type": "mrr_at_10", "value": 39.214}, {"type": "mrr_at_100", "value": 40.251}, {"type": "mrr_at_1000", "value": 40.281}, {"type": "mrr_at_3", "value": 35.291}, {"type": "mrr_at_5", "value": 37.545}, {"type": "ndcg_at_1", "value": 25.874000000000002}, {"type": "ndcg_at_10", "value": 45.98}, {"type": "ndcg_at_100", "value": 51.197}, {"type": "ndcg_at_1000", "value": 52.073}, {"type": "ndcg_at_3", "value": 37.785999999999994}, {"type": "ndcg_at_5", "value": 41.870000000000005}, {"type": "precision_at_1", "value": 25.874000000000002}, {"type": "precision_at_10", "value": 7.181}, {"type": "precision_at_100", "value": 0.979}, {"type": "precision_at_1000", "value": 0.106}, {"type": "precision_at_3", "value": 16.051000000000002}, {"type": "precision_at_5", "value": 11.713}, {"type": "recall_at_1", "value": 25.176}, {"type": "recall_at_10", "value": 68.67699999999999}, {"type": "recall_at_100", "value": 92.55}, {"type": "recall_at_1000", 
"value": 99.164}, {"type": "recall_at_3", "value": 46.372}, {"type": "recall_at_5", "value": 56.16}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MTOPDomainClassification (en)", "type": "mteb/mtop_domain", "config": "en", "split": "test", "revision": "d80d48c1eb48d3562165c59d59d0034df9fff0bf"}, "metrics": [{"type": "accuracy", "value": 99.03784769721841}, {"type": "f1", "value": 98.97791641821495}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MTOPIntentClassification (en)", "type": "mteb/mtop_intent", "config": "en", "split": "test", "revision": "ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba"}, "metrics": [{"type": "accuracy", "value": 91.88326493388054}, {"type": "f1", "value": 73.74809928034335}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MassiveIntentClassification (en)", "type": "mteb/amazon_massive_intent", "config": "en", "split": "test", "revision": "31efe3c427b0bae9c22cbb560b8f15491cc6bed7"}, "metrics": [{"type": "accuracy", "value": 85.41358439811701}, {"type": "f1", "value": 83.503679460639}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MassiveScenarioClassification (en)", "type": "mteb/amazon_massive_scenario", "config": "en", "split": "test", "revision": "7d571f92784cd94a019292a1f45445077d0ef634"}, "metrics": [{"type": "accuracy", "value": 89.77135171486215}, {"type": "f1", "value": 88.89843747468366}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB MedrxivClusteringP2P", "type": "mteb/medrxiv-clustering-p2p", "config": "default", "split": "test", "revision": "e7a26af6f3ae46b30dde8737f02c07b1505bcc73"}, "metrics": [{"type": "v_measure", "value": 46.22695362087359}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB MedrxivClusteringS2S", "type": "mteb/medrxiv-clustering-s2s", "config": "default", "split": "test", "revision": "35191c8c0dca72d8ff3efcd72aa802307d469663"}, "metrics": [{"type": "v_measure", "value": 44.132372165849425}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB MindSmallReranking", "type": "mteb/mind_small", "config": "default", "split": "test", "revision": "3bdac13927fdc888b903db93b2ffdbd90b295a69"}, "metrics": [{"type": "map", "value": 33.35680810650402}, {"type": "mrr", "value": 34.72625715637218}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB NFCorpus", "type": "mteb/nfcorpus", "config": "default", "split": "test", "revision": "ec0fa4fe99da2ff19ca1214b7966684033a58814"}, "metrics": [{"type": "map_at_1", "value": 7.165000000000001}, {"type": "map_at_10", "value": 15.424}, {"type": "map_at_100", "value": 20.28}, {"type": "map_at_1000", "value": 22.065}, {"type": "map_at_3", "value": 11.236}, {"type": "map_at_5", "value": 13.025999999999998}, {"type": "mrr_at_1", "value": 51.702999999999996}, {"type": "mrr_at_10", "value": 59.965}, {"type": "mrr_at_100", "value": 60.667}, {"type": "mrr_at_1000", "value": 60.702999999999996}, {"type": "mrr_at_3", "value": 58.772000000000006}, {"type": "mrr_at_5", "value": 59.267}, {"type": "ndcg_at_1", "value": 49.536}, {"type": "ndcg_at_10", "value": 40.6}, {"type": "ndcg_at_100", "value": 37.848}, {"type": "ndcg_at_1000", "value": 46.657}, {"type": "ndcg_at_3", "value": 46.117999999999995}, {"type": "ndcg_at_5", "value": 43.619}, {"type": "precision_at_1", "value": 51.393}, {"type": "precision_at_10", "value": 30.31}, {"type": "precision_at_100", "value": 9.972}, {"type": "precision_at_1000", "value": 2.329}, {"type": "precision_at_3", "value": 43.137}, {"type": "precision_at_5", "value": 37.585}, 
{"type": "recall_at_1", "value": 7.165000000000001}, {"type": "recall_at_10", "value": 19.689999999999998}, {"type": "recall_at_100", "value": 39.237}, {"type": "recall_at_1000", "value": 71.417}, {"type": "recall_at_3", "value": 12.247}, {"type": "recall_at_5", "value": 14.902999999999999}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB NQ", "type": "mteb/nq", "config": "default", "split": "test", "revision": "b774495ed302d8c44a3a7ea25c90dbce03968f31"}, "metrics": [{"type": "map_at_1", "value": 42.653999999999996}, {"type": "map_at_10", "value": 59.611999999999995}, {"type": "map_at_100", "value": 60.32300000000001}, {"type": "map_at_1000", "value": 60.336}, {"type": "map_at_3", "value": 55.584999999999994}, {"type": "map_at_5", "value": 58.19}, {"type": "mrr_at_1", "value": 47.683}, {"type": "mrr_at_10", "value": 62.06700000000001}, {"type": "mrr_at_100", "value": 62.537}, {"type": "mrr_at_1000", "value": 62.544999999999995}, {"type": "mrr_at_3", "value": 59.178}, {"type": "mrr_at_5", "value": 61.034}, {"type": "ndcg_at_1", "value": 47.654}, {"type": "ndcg_at_10", "value": 67.001}, {"type": "ndcg_at_100", "value": 69.73899999999999}, {"type": "ndcg_at_1000", "value": 69.986}, {"type": "ndcg_at_3", "value": 59.95700000000001}, {"type": "ndcg_at_5", "value": 64.025}, {"type": "precision_at_1", "value": 47.654}, {"type": "precision_at_10", "value": 10.367999999999999}, {"type": "precision_at_100", "value": 1.192}, {"type": "precision_at_1000", "value": 0.121}, {"type": "precision_at_3", "value": 26.651000000000003}, {"type": "precision_at_5", "value": 18.459}, {"type": "recall_at_1", "value": 42.653999999999996}, {"type": "recall_at_10", "value": 86.619}, {"type": "recall_at_100", "value": 98.04899999999999}, {"type": "recall_at_1000", "value": 99.812}, {"type": "recall_at_3", "value": 68.987}, {"type": "recall_at_5", "value": 78.158}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB QuoraRetrieval", "type": "mteb/quora", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "map_at_1", "value": 72.538}, {"type": "map_at_10", "value": 86.702}, {"type": "map_at_100", "value": 87.31}, {"type": "map_at_1000", "value": 87.323}, {"type": "map_at_3", "value": 83.87}, {"type": "map_at_5", "value": 85.682}, {"type": "mrr_at_1", "value": 83.31}, {"type": "mrr_at_10", "value": 89.225}, {"type": "mrr_at_100", "value": 89.30399999999999}, {"type": "mrr_at_1000", "value": 89.30399999999999}, {"type": "mrr_at_3", "value": 88.44300000000001}, {"type": "mrr_at_5", "value": 89.005}, {"type": "ndcg_at_1", "value": 83.32000000000001}, {"type": "ndcg_at_10", "value": 90.095}, {"type": "ndcg_at_100", "value": 91.12}, {"type": "ndcg_at_1000", "value": 91.179}, {"type": "ndcg_at_3", "value": 87.606}, {"type": "ndcg_at_5", "value": 89.031}, {"type": "precision_at_1", "value": 83.32000000000001}, {"type": "precision_at_10", "value": 13.641}, {"type": "precision_at_100", "value": 1.541}, {"type": "precision_at_1000", "value": 0.157}, {"type": "precision_at_3", "value": 38.377}, {"type": "precision_at_5", "value": 25.162000000000003}, {"type": "recall_at_1", "value": 72.538}, {"type": "recall_at_10", "value": 96.47200000000001}, {"type": "recall_at_100", "value": 99.785}, {"type": "recall_at_1000", "value": 99.99900000000001}, {"type": "recall_at_3", "value": 89.278}, {"type": "recall_at_5", "value": 93.367}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB RedditClustering", "type": "mteb/reddit-clustering", "config": "default", "split": "test", 
"revision": "24640382cdbf8abc73003fb0fa6d111a705499eb"}, "metrics": [{"type": "v_measure", "value": 73.55219145406065}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB RedditClusteringP2P", "type": "mteb/reddit-clustering-p2p", "config": "default", "split": "test", "revision": "282350215ef01743dc01b456c7f5241fa8937f16"}, "metrics": [{"type": "v_measure", "value": 74.13437105242755}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB SCIDOCS", "type": "mteb/scidocs", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "map_at_1", "value": 6.873}, {"type": "map_at_10", "value": 17.944}, {"type": "map_at_100", "value": 21.171}, {"type": "map_at_1000", "value": 21.528}, {"type": "map_at_3", "value": 12.415}, {"type": "map_at_5", "value": 15.187999999999999}, {"type": "mrr_at_1", "value": 33.800000000000004}, {"type": "mrr_at_10", "value": 46.455}, {"type": "mrr_at_100", "value": 47.378}, {"type": "mrr_at_1000", "value": 47.394999999999996}, {"type": "mrr_at_3", "value": 42.367}, {"type": "mrr_at_5", "value": 44.972}, {"type": "ndcg_at_1", "value": 33.800000000000004}, {"type": "ndcg_at_10", "value": 28.907}, {"type": "ndcg_at_100", "value": 39.695}, {"type": "ndcg_at_1000", "value": 44.582}, {"type": "ndcg_at_3", "value": 26.949}, {"type": "ndcg_at_5", "value": 23.988}, {"type": "precision_at_1", "value": 33.800000000000004}, {"type": "precision_at_10", "value": 15.079999999999998}, {"type": "precision_at_100", "value": 3.056}, {"type": "precision_at_1000", "value": 0.42100000000000004}, {"type": "precision_at_3", "value": 25.167}, {"type": "precision_at_5", "value": 21.26}, {"type": "recall_at_1", "value": 6.873}, {"type": "recall_at_10", "value": 30.568}, {"type": "recall_at_100", "value": 62.062}, {"type": "recall_at_1000", "value": 85.37700000000001}, {"type": "recall_at_3", "value": 15.312999999999999}, {"type": "recall_at_5", "value": 21.575}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB SICK-R", "type": "mteb/sickr-sts", "config": "default", "split": "test", "revision": "a6ea5a8cab320b040a23452cc28066d9beae2cee"}, "metrics": [{"type": "cos_sim_pearson", "value": 82.37009118256057}, {"type": "cos_sim_spearman", "value": 79.27986395671529}, {"type": "euclidean_pearson", "value": 79.18037715442115}, {"type": "euclidean_spearman", "value": 79.28004791561621}, {"type": "manhattan_pearson", "value": 79.34062972800541}, {"type": "manhattan_spearman", "value": 79.43106695543402}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS12", "type": "mteb/sts12-sts", "config": "default", "split": "test", "revision": "a0d554a64d88156834ff5ae9920b964011b16384"}, "metrics": [{"type": "cos_sim_pearson", "value": 87.48474767383833}, {"type": "cos_sim_spearman", "value": 79.54505388752513}, {"type": "euclidean_pearson", "value": 83.43282704179565}, {"type": "euclidean_spearman", "value": 79.54579919925405}, {"type": "manhattan_pearson", "value": 83.77564492427952}, {"type": "manhattan_spearman", "value": 79.84558396989286}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS13", "type": "mteb/sts13-sts", "config": "default", "split": "test", "revision": "7e90230a92c190f1bf69ae9002b8cea547a64cca"}, "metrics": [{"type": "cos_sim_pearson", "value": 88.803698035802}, {"type": "cos_sim_spearman", "value": 88.83451367754881}, {"type": "euclidean_pearson", "value": 88.28939285711628}, {"type": "euclidean_spearman", "value": 88.83528996073112}, {"type": "manhattan_pearson", "value": 88.28017412671795}, {"type": "manhattan_spearman", "value": 
88.9228828016344}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS14", "type": "mteb/sts14-sts", "config": "default", "split": "test", "revision": "6031580fec1f6af667f0bd2da0a551cf4f0b2375"}, "metrics": [{"type": "cos_sim_pearson", "value": 85.27469288153428}, {"type": "cos_sim_spearman", "value": 83.87477064876288}, {"type": "euclidean_pearson", "value": 84.2601737035379}, {"type": "euclidean_spearman", "value": 83.87431082479074}, {"type": "manhattan_pearson", "value": 84.3621547772745}, {"type": "manhattan_spearman", "value": 84.12094375000423}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS15", "type": "mteb/sts15-sts", "config": "default", "split": "test", "revision": "ae752c7c21bf194d8b67fd573edf7ae58183cbe3"}, "metrics": [{"type": "cos_sim_pearson", "value": 88.12749863201587}, {"type": "cos_sim_spearman", "value": 88.54287568368565}, {"type": "euclidean_pearson", "value": 87.90429700607999}, {"type": "euclidean_spearman", "value": 88.5437689576261}, {"type": "manhattan_pearson", "value": 88.19276653356833}, {"type": "manhattan_spearman", "value": 88.99995393814679}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS16", "type": "mteb/sts16-sts", "config": "default", "split": "test", "revision": "4d8694f8f0e0100860b497b999b3dbed754a0513"}, "metrics": [{"type": "cos_sim_pearson", "value": 85.68398747560902}, {"type": "cos_sim_spearman", "value": 86.48815303460574}, {"type": "euclidean_pearson", "value": 85.52356631237954}, {"type": "euclidean_spearman", "value": 86.486391949551}, {"type": "manhattan_pearson", "value": 85.67267981761788}, {"type": "manhattan_spearman", "value": 86.7073696332485}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS17 (en-en)", "type": "mteb/sts17-crosslingual-sts", "config": "en-en", "split": "test", "revision": "af5e6fb845001ecf41f4c1e033ce921939a2a68d"}, "metrics": [{"type": "cos_sim_pearson", "value": 88.9057107443124}, {"type": "cos_sim_spearman", "value": 88.7312168757697}, {"type": "euclidean_pearson", "value": 88.72810439714794}, {"type": "euclidean_spearman", "value": 88.71976185854771}, {"type": "manhattan_pearson", "value": 88.50433745949111}, {"type": "manhattan_spearman", "value": 88.51726175544195}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS22 (en)", "type": "mteb/sts22-crosslingual-sts", "config": "en", "split": "test", "revision": "eea2b4fe26a775864c896887d910b76a8098ad3f"}, "metrics": [{"type": "cos_sim_pearson", "value": 67.59391795109886}, {"type": "cos_sim_spearman", "value": 66.87613008631367}, {"type": "euclidean_pearson", "value": 69.23198488262217}, {"type": "euclidean_spearman", "value": 66.85427723013692}, {"type": "manhattan_pearson", "value": 69.50730124841084}, {"type": "manhattan_spearman", "value": 67.10404669820792}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STSBenchmark", "type": "mteb/stsbenchmark-sts", "config": "default", "split": "test", "revision": "b0fddb56ed78048fa8b90373c8a3cfc37b684831"}, "metrics": [{"type": "cos_sim_pearson", "value": 87.0820605344619}, {"type": "cos_sim_spearman", "value": 86.8518089863434}, {"type": "euclidean_pearson", "value": 86.31087134689284}, {"type": "euclidean_spearman", "value": 86.8518520517941}, {"type": "manhattan_pearson", "value": 86.47203796160612}, {"type": "manhattan_spearman", "value": 87.1080149734421}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB SciDocsRR", "type": "mteb/scidocs-reranking", "config": "default", "split": "test", "revision": "d3c5e1fc0b855ab6097bf1cda04dd73947d7caab"}, "metrics": 
[{"type": "map", "value": 89.09255369305481}, {"type": "mrr", "value": 97.10323445617563}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB SciFact", "type": "mteb/scifact", "config": "default", "split": "test", "revision": "0228b52cf27578f30900b9e5271d331663a030d7"}, "metrics": [{"type": "map_at_1", "value": 61.260999999999996}, {"type": "map_at_10", "value": 74.043}, {"type": "map_at_100", "value": 74.37700000000001}, {"type": "map_at_1000", "value": 74.384}, {"type": "map_at_3", "value": 71.222}, {"type": "map_at_5", "value": 72.875}, {"type": "mrr_at_1", "value": 64.333}, {"type": "mrr_at_10", "value": 74.984}, {"type": "mrr_at_100", "value": 75.247}, {"type": "mrr_at_1000", "value": 75.25500000000001}, {"type": "mrr_at_3", "value": 73.167}, {"type": "mrr_at_5", "value": 74.35000000000001}, {"type": "ndcg_at_1", "value": 64.333}, {"type": "ndcg_at_10", "value": 79.06}, {"type": "ndcg_at_100", "value": 80.416}, {"type": "ndcg_at_1000", "value": 80.55600000000001}, {"type": "ndcg_at_3", "value": 74.753}, {"type": "ndcg_at_5", "value": 76.97500000000001}, {"type": "precision_at_1", "value": 64.333}, {"type": "precision_at_10", "value": 10.567}, {"type": "precision_at_100", "value": 1.1199999999999999}, {"type": "precision_at_1000", "value": 0.11299999999999999}, {"type": "precision_at_3", "value": 29.889}, {"type": "precision_at_5", "value": 19.533}, {"type": "recall_at_1", "value": 61.260999999999996}, {"type": "recall_at_10", "value": 93.167}, {"type": "recall_at_100", "value": 99.0}, {"type": "recall_at_1000", "value": 100.0}, {"type": "recall_at_3", "value": 81.667}, {"type": "recall_at_5", "value": 87.394}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB SprintDuplicateQuestions", "type": "mteb/sprintduplicatequestions-pairclassification", "config": "default", "split": "test", "revision": "d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46"}, "metrics": [{"type": "cos_sim_accuracy", "value": 99.71980198019801}, {"type": "cos_sim_ap", "value": 92.81616007802704}, {"type": "cos_sim_f1", "value": 85.17548454688318}, {"type": "cos_sim_precision", "value": 89.43894389438944}, {"type": "cos_sim_recall", "value": 81.3}, {"type": "dot_accuracy", "value": 99.71980198019801}, {"type": "dot_ap", "value": 92.81398760591358}, {"type": "dot_f1", "value": 85.17548454688318}, {"type": "dot_precision", "value": 89.43894389438944}, {"type": "dot_recall", "value": 81.3}, {"type": "euclidean_accuracy", "value": 99.71980198019801}, {"type": "euclidean_ap", "value": 92.81560637245072}, {"type": "euclidean_f1", "value": 85.17548454688318}, {"type": "euclidean_precision", "value": 89.43894389438944}, {"type": "euclidean_recall", "value": 81.3}, {"type": "manhattan_accuracy", "value": 99.73069306930694}, {"type": "manhattan_ap", "value": 93.14005487480794}, {"type": "manhattan_f1", "value": 85.56263269639068}, {"type": "manhattan_precision", "value": 91.17647058823529}, {"type": "manhattan_recall", "value": 80.60000000000001}, {"type": "max_accuracy", "value": 99.73069306930694}, {"type": "max_ap", "value": 93.14005487480794}, {"type": "max_f1", "value": 85.56263269639068}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB StackExchangeClustering", "type": "mteb/stackexchange-clustering", "config": "default", "split": "test", "revision": "6cbc1f7b2bc0622f2e39d2c77fa502909748c259"}, "metrics": [{"type": "v_measure", "value": 79.86443362395185}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB StackExchangeClusteringP2P", "type": "mteb/stackexchange-clustering-p2p", 
"config": "default", "split": "test", "revision": "815ca46b2622cec33ccafc3735d572c266efdb44"}, "metrics": [{"type": "v_measure", "value": 49.40897096662564}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB StackOverflowDupQuestions", "type": "mteb/stackoverflowdupquestions-reranking", "config": "default", "split": "test", "revision": "e185fbe320c72810689fc5848eb6114e1ef5ec69"}, "metrics": [{"type": "map", "value": 55.66040806627947}, {"type": "mrr", "value": 56.58670475766064}]}, {"task": {"type": "Summarization"}, "dataset": {"name": "MTEB SummEval", "type": "mteb/summeval", "config": "default", "split": "test", "revision": "cda12ad7615edc362dbf25a00fdd61d3b1eaf93c"}, "metrics": [{"type": "cos_sim_pearson", "value": 31.51015090598575}, {"type": "cos_sim_spearman", "value": 31.35016454939226}, {"type": "dot_pearson", "value": 31.5150068731}, {"type": "dot_spearman", "value": 31.34790869023487}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB TRECCOVID", "type": "mteb/trec-covid", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "map_at_1", "value": 0.254}, {"type": "map_at_10", "value": 2.064}, {"type": "map_at_100", "value": 12.909}, {"type": "map_at_1000", "value": 31.761}, {"type": "map_at_3", "value": 0.738}, {"type": "map_at_5", "value": 1.155}, {"type": "mrr_at_1", "value": 96.0}, {"type": "mrr_at_10", "value": 98.0}, {"type": "mrr_at_100", "value": 98.0}, {"type": "mrr_at_1000", "value": 98.0}, {"type": "mrr_at_3", "value": 98.0}, {"type": "mrr_at_5", "value": 98.0}, {"type": "ndcg_at_1", "value": 93.0}, {"type": "ndcg_at_10", "value": 82.258}, {"type": "ndcg_at_100", "value": 64.34}, {"type": "ndcg_at_1000", "value": 57.912}, {"type": "ndcg_at_3", "value": 90.827}, {"type": "ndcg_at_5", "value": 86.79}, {"type": "precision_at_1", "value": 96.0}, {"type": "precision_at_10", "value": 84.8}, {"type": "precision_at_100", "value": 66.0}, {"type": "precision_at_1000", "value": 25.356}, {"type": "precision_at_3", "value": 94.667}, {"type": "precision_at_5", "value": 90.4}, {"type": "recall_at_1", "value": 0.254}, {"type": "recall_at_10", "value": 2.1950000000000003}, {"type": "recall_at_100", "value": 16.088}, {"type": "recall_at_1000", "value": 54.559000000000005}, {"type": "recall_at_3", "value": 0.75}, {"type": "recall_at_5", "value": 1.191}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB Touche2020", "type": "mteb/touche2020", "config": "default", "split": "test", "revision": "a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f"}, "metrics": [{"type": "map_at_1", "value": 2.976}, {"type": "map_at_10", "value": 11.389000000000001}, {"type": "map_at_100", "value": 18.429000000000002}, {"type": "map_at_1000", "value": 20.113}, {"type": "map_at_3", "value": 6.483}, {"type": "map_at_5", "value": 8.770999999999999}, {"type": "mrr_at_1", "value": 40.816}, {"type": "mrr_at_10", "value": 58.118}, {"type": "mrr_at_100", "value": 58.489999999999995}, {"type": "mrr_at_1000", "value": 58.489999999999995}, {"type": "mrr_at_3", "value": 53.061}, {"type": "mrr_at_5", "value": 57.041}, {"type": "ndcg_at_1", "value": 40.816}, {"type": "ndcg_at_10", "value": 30.567}, {"type": "ndcg_at_100", "value": 42.44}, {"type": "ndcg_at_1000", "value": 53.480000000000004}, {"type": "ndcg_at_3", "value": 36.016}, {"type": "ndcg_at_5", "value": 34.257}, {"type": "precision_at_1", "value": 42.857}, {"type": "precision_at_10", "value": 25.714}, {"type": "precision_at_100", "value": 8.429}, {"type": "precision_at_1000", "value": 1.5939999999999999}, {"type": 
"precision_at_3", "value": 36.735}, {"type": "precision_at_5", "value": 33.878}, {"type": "recall_at_1", "value": 2.976}, {"type": "recall_at_10", "value": 17.854999999999997}, {"type": "recall_at_100", "value": 51.833}, {"type": "recall_at_1000", "value": 86.223}, {"type": "recall_at_3", "value": 7.887}, {"type": "recall_at_5", "value": 12.026}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB ToxicConversationsClassification", "type": "mteb/toxic_conversations_50k", "config": "default", "split": "test", "revision": "d7c0de2777da35d6aae2200a62c6e0e5af397c4c"}, "metrics": [{"type": "accuracy", "value": 85.1174}, {"type": "ap", "value": 30.169441069345748}, {"type": "f1", "value": 69.79254701873245}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB TweetSentimentExtractionClassification", "type": "mteb/tweet_sentiment_extraction", "config": "default", "split": "test", "revision": "d604517c81ca91fe16a244d1248fc021f9ecee7a"}, "metrics": [{"type": "accuracy", "value": 72.58347481607245}, {"type": "f1", "value": 72.74877295564937}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB TwentyNewsgroupsClustering", "type": "mteb/twentynewsgroups-clustering", "config": "default", "split": "test", "revision": "6125ec4e24fa026cec8a478383ee943acfbd5449"}, "metrics": [{"type": "v_measure", "value": 53.90586138221305}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB TwitterSemEval2015", "type": "mteb/twittersemeval2015-pairclassification", "config": "default", "split": "test", "revision": "70970daeab8776df92f5ea462b6173c0b46fd2d1"}, "metrics": [{"type": "cos_sim_accuracy", "value": 87.35769207844072}, {"type": "cos_sim_ap", "value": 77.9645072410354}, {"type": "cos_sim_f1", "value": 71.32352941176471}, {"type": "cos_sim_precision", "value": 66.5903890160183}, {"type": "cos_sim_recall", "value": 76.78100263852242}, {"type": "dot_accuracy", "value": 87.37557370209214}, {"type": "dot_ap", "value": 77.96250046429908}, {"type": "dot_f1", "value": 71.28932757557064}, {"type": "dot_precision", "value": 66.95249130938586}, {"type": "dot_recall", "value": 76.22691292875989}, {"type": "euclidean_accuracy", "value": 87.35173153722357}, {"type": "euclidean_ap", "value": 77.96520460741593}, {"type": "euclidean_f1", "value": 71.32470733210104}, {"type": "euclidean_precision", "value": 66.91329479768785}, {"type": "euclidean_recall", "value": 76.35883905013192}, {"type": "manhattan_accuracy", "value": 87.25636287774931}, {"type": "manhattan_ap", "value": 77.77752485611796}, {"type": "manhattan_f1", "value": 71.18148599269183}, {"type": "manhattan_precision", "value": 66.10859728506787}, {"type": "manhattan_recall", "value": 77.0976253298153}, {"type": "max_accuracy", "value": 87.37557370209214}, {"type": "max_ap", "value": 77.96520460741593}, {"type": "max_f1", "value": 71.32470733210104}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB TwitterURLCorpus", "type": "mteb/twitterurlcorpus-pairclassification", "config": "default", "split": "test", "revision": "8b6510b0b1fa4e4c4f879467980e9be563ec1cdf"}, "metrics": [{"type": "cos_sim_accuracy", "value": 89.38176737687739}, {"type": "cos_sim_ap", "value": 86.58811861657401}, {"type": "cos_sim_f1", "value": 79.09430644097604}, {"type": "cos_sim_precision", "value": 75.45085977911366}, {"type": "cos_sim_recall", "value": 83.10748383122882}, {"type": "dot_accuracy", "value": 89.38370784336554}, {"type": "dot_ap", "value": 86.58840606004333}, {"type": "dot_f1", "value": 79.10179860068133}, 
{"type": "dot_precision", "value": 75.44546153308643}, {"type": "dot_recall", "value": 83.13058207576223}, {"type": "euclidean_accuracy", "value": 89.38564830985369}, {"type": "euclidean_ap", "value": 86.58820721061164}, {"type": "euclidean_f1", "value": 79.09070942235888}, {"type": "euclidean_precision", "value": 75.38729937194697}, {"type": "euclidean_recall", "value": 83.17677856482906}, {"type": "manhattan_accuracy", "value": 89.40699344122326}, {"type": "manhattan_ap", "value": 86.60631843011362}, {"type": "manhattan_f1", "value": 79.14949970570925}, {"type": "manhattan_precision", "value": 75.78191039729502}, {"type": "manhattan_recall", "value": 82.83030489682784}, {"type": "max_accuracy", "value": 89.40699344122326}, {"type": "max_ap", "value": 86.60631843011362}, {"type": "max_f1", "value": 79.14949970570925}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB AFQMC", "type": "C-MTEB/AFQMC", "config": "default", "split": "validation", "revision": "b44c3b011063adb25877c13823db83bb193913c4"}, "metrics": [{"type": "cos_sim_pearson", "value": 65.58442135663871}, {"type": "cos_sim_spearman", "value": 72.2538631361313}, {"type": "euclidean_pearson", "value": 70.97255486607429}, {"type": "euclidean_spearman", "value": 72.25374250228647}, {"type": "manhattan_pearson", "value": 70.83250199989911}, {"type": "manhattan_spearman", "value": 72.14819496536272}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB ATEC", "type": "C-MTEB/ATEC", "config": "default", "split": "test", "revision": "0f319b1142f28d00e055a6770f3f726ae9b7d865"}, "metrics": [{"type": "cos_sim_pearson", "value": 59.99478404929932}, {"type": "cos_sim_spearman", "value": 62.61836216999812}, {"type": "euclidean_pearson", "value": 66.86429811933593}, {"type": "euclidean_spearman", "value": 62.6183520374191}, {"type": "manhattan_pearson", "value": 66.8063778911633}, {"type": "manhattan_spearman", "value": 62.569607573241115}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB AmazonReviewsClassification (zh)", "type": "mteb/amazon_reviews_multi", "config": "zh", "split": "test", "revision": "1399c76144fd37290681b995c656ef9b2e06e26d"}, "metrics": [{"type": "accuracy", "value": 53.98400000000001}, {"type": "f1", "value": 51.21447361350723}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB BQ", "type": "C-MTEB/BQ", "config": "default", "split": "test", "revision": "e3dda5e115e487b39ec7e618c0c6a29137052a55"}, "metrics": [{"type": "cos_sim_pearson", "value": 79.11941660686553}, {"type": "cos_sim_spearman", "value": 81.25029594540435}, {"type": "euclidean_pearson", "value": 82.06973504238826}, {"type": "euclidean_spearman", "value": 81.2501989488524}, {"type": "manhattan_pearson", "value": 82.10094630392753}, {"type": "manhattan_spearman", "value": 81.27987244392389}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB CLSClusteringP2P", "type": "C-MTEB/CLSClusteringP2P", "config": "default", "split": "test", "revision": "4b6227591c6c1a73bc76b1055f3b7f3588e72476"}, "metrics": [{"type": "v_measure", "value": 47.07270168705156}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB CLSClusteringS2S", "type": "C-MTEB/CLSClusteringS2S", "config": "default", "split": "test", "revision": "e458b3f5414b62b7f9f83499ac1f5497ae2e869f"}, "metrics": [{"type": "v_measure", "value": 45.98511703185043}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB CMedQAv1", "type": "C-MTEB/CMedQAv1-reranking", "config": "default", "split": "test", "revision": "8d7f1e942507dac42dc58017c1a001c3717da7df"}, 
"metrics": [{"type": "map", "value": 88.19895157194931}, {"type": "mrr", "value": 90.21424603174603}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB CMedQAv2", "type": "C-MTEB/CMedQAv2-reranking", "config": "default", "split": "test", "revision": "23d186750531a14a0357ca22cd92d712fd512ea0"}, "metrics": [{"type": "map", "value": 88.03317320980119}, {"type": "mrr", "value": 89.9461507936508}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CmedqaRetrieval", "type": "C-MTEB/CmedqaRetrieval", "config": "default", "split": "dev", "revision": "cd540c506dae1cf9e9a59c3e06f42030d54e7301"}, "metrics": [{"type": "map_at_1", "value": 29.037000000000003}, {"type": "map_at_10", "value": 42.001}, {"type": "map_at_100", "value": 43.773}, {"type": "map_at_1000", "value": 43.878}, {"type": "map_at_3", "value": 37.637}, {"type": "map_at_5", "value": 40.034}, {"type": "mrr_at_1", "value": 43.136}, {"type": "mrr_at_10", "value": 51.158}, {"type": "mrr_at_100", "value": 52.083}, {"type": "mrr_at_1000", "value": 52.12}, {"type": "mrr_at_3", "value": 48.733}, {"type": "mrr_at_5", "value": 50.025}, {"type": "ndcg_at_1", "value": 43.136}, {"type": "ndcg_at_10", "value": 48.685}, {"type": "ndcg_at_100", "value": 55.513}, {"type": "ndcg_at_1000", "value": 57.242000000000004}, {"type": "ndcg_at_3", "value": 43.329}, {"type": "ndcg_at_5", "value": 45.438}, {"type": "precision_at_1", "value": 43.136}, {"type": "precision_at_10", "value": 10.56}, {"type": "precision_at_100", "value": 1.6129999999999998}, {"type": "precision_at_1000", "value": 0.184}, {"type": "precision_at_3", "value": 24.064}, {"type": "precision_at_5", "value": 17.269000000000002}, {"type": "recall_at_1", "value": 29.037000000000003}, {"type": "recall_at_10", "value": 59.245000000000005}, {"type": "recall_at_100", "value": 87.355}, {"type": "recall_at_1000", "value": 98.74000000000001}, {"type": "recall_at_3", "value": 42.99}, {"type": "recall_at_5", "value": 49.681999999999995}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB Cmnli", "type": "C-MTEB/CMNLI", "config": "default", "split": "validation", "revision": "41bc36f332156f7adc9e38f53777c959b2ae9766"}, "metrics": [{"type": "cos_sim_accuracy", "value": 82.68190018039687}, {"type": "cos_sim_ap", "value": 90.18017125327886}, {"type": "cos_sim_f1", "value": 83.64080906868193}, {"type": "cos_sim_precision", "value": 79.7076890489303}, {"type": "cos_sim_recall", "value": 87.98223053542202}, {"type": "dot_accuracy", "value": 82.68190018039687}, {"type": "dot_ap", "value": 90.18782350103646}, {"type": "dot_f1", "value": 83.64242087729039}, {"type": "dot_precision", "value": 79.65313028764805}, {"type": "dot_recall", "value": 88.05237315875614}, {"type": "euclidean_accuracy", "value": 82.68190018039687}, {"type": "euclidean_ap", "value": 90.1801957900632}, {"type": "euclidean_f1", "value": 83.63636363636364}, {"type": "euclidean_precision", "value": 79.52772506852203}, {"type": "euclidean_recall", "value": 88.19265840542437}, {"type": "manhattan_accuracy", "value": 82.14070956103427}, {"type": "manhattan_ap", "value": 89.96178420101427}, {"type": "manhattan_f1", "value": 83.21087838578791}, {"type": "manhattan_precision", "value": 78.35605121850475}, {"type": "manhattan_recall", "value": 88.70703764320785}, {"type": "max_accuracy", "value": 82.68190018039687}, {"type": "max_ap", "value": 90.18782350103646}, {"type": "max_f1", "value": 83.64242087729039}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CovidRetrieval", "type": "C-MTEB/CovidRetrieval", 
"config": "default", "split": "dev", "revision": "1271c7809071a13532e05f25fb53511ffce77117"}, "metrics": [{"type": "map_at_1", "value": 72.234}, {"type": "map_at_10", "value": 80.10000000000001}, {"type": "map_at_100", "value": 80.36}, {"type": "map_at_1000", "value": 80.363}, {"type": "map_at_3", "value": 78.315}, {"type": "map_at_5", "value": 79.607}, {"type": "mrr_at_1", "value": 72.392}, {"type": "mrr_at_10", "value": 80.117}, {"type": "mrr_at_100", "value": 80.36999999999999}, {"type": "mrr_at_1000", "value": 80.373}, {"type": "mrr_at_3", "value": 78.469}, {"type": "mrr_at_5", "value": 79.633}, {"type": "ndcg_at_1", "value": 72.392}, {"type": "ndcg_at_10", "value": 83.651}, {"type": "ndcg_at_100", "value": 84.749}, {"type": "ndcg_at_1000", "value": 84.83000000000001}, {"type": "ndcg_at_3", "value": 80.253}, {"type": "ndcg_at_5", "value": 82.485}, {"type": "precision_at_1", "value": 72.392}, {"type": "precision_at_10", "value": 9.557}, {"type": "precision_at_100", "value": 1.004}, {"type": "precision_at_1000", "value": 0.101}, {"type": "precision_at_3", "value": 28.732000000000003}, {"type": "precision_at_5", "value": 18.377}, {"type": "recall_at_1", "value": 72.234}, {"type": "recall_at_10", "value": 94.573}, {"type": "recall_at_100", "value": 99.368}, {"type": "recall_at_1000", "value": 100.0}, {"type": "recall_at_3", "value": 85.669}, {"type": "recall_at_5", "value": 91.01700000000001}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB DuRetrieval", "type": "C-MTEB/DuRetrieval", "config": "default", "split": "dev", "revision": "a1a333e290fe30b10f3f56498e3a0d911a693ced"}, "metrics": [{"type": "map_at_1", "value": 26.173999999999996}, {"type": "map_at_10", "value": 80.04}, {"type": "map_at_100", "value": 82.94500000000001}, {"type": "map_at_1000", "value": 82.98100000000001}, {"type": "map_at_3", "value": 55.562999999999995}, {"type": "map_at_5", "value": 69.89800000000001}, {"type": "mrr_at_1", "value": 89.5}, {"type": "mrr_at_10", "value": 92.996}, {"type": "mrr_at_100", "value": 93.06400000000001}, {"type": "mrr_at_1000", "value": 93.065}, {"type": "mrr_at_3", "value": 92.658}, {"type": "mrr_at_5", "value": 92.84599999999999}, {"type": "ndcg_at_1", "value": 89.5}, {"type": "ndcg_at_10", "value": 87.443}, {"type": "ndcg_at_100", "value": 90.253}, {"type": "ndcg_at_1000", "value": 90.549}, {"type": "ndcg_at_3", "value": 85.874}, {"type": "ndcg_at_5", "value": 84.842}, {"type": "precision_at_1", "value": 89.5}, {"type": "precision_at_10", "value": 41.805}, {"type": "precision_at_100", "value": 4.827}, {"type": "precision_at_1000", "value": 0.49}, {"type": "precision_at_3", "value": 76.85}, {"type": "precision_at_5", "value": 64.8}, {"type": "recall_at_1", "value": 26.173999999999996}, {"type": "recall_at_10", "value": 89.101}, {"type": "recall_at_100", "value": 98.08099999999999}, {"type": "recall_at_1000", "value": 99.529}, {"type": "recall_at_3", "value": 57.902}, {"type": "recall_at_5", "value": 74.602}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB EcomRetrieval", "type": "C-MTEB/EcomRetrieval", "config": "default", "split": "dev", "revision": "687de13dc7294d6fd9be10c6945f9e8fec8166b9"}, "metrics": [{"type": "map_at_1", "value": 56.10000000000001}, {"type": "map_at_10", "value": 66.15299999999999}, {"type": "map_at_100", "value": 66.625}, {"type": "map_at_1000", "value": 66.636}, {"type": "map_at_3", "value": 63.632999999999996}, {"type": "map_at_5", "value": 65.293}, {"type": "mrr_at_1", "value": 56.10000000000001}, {"type": "mrr_at_10", "value": 
66.15299999999999}, {"type": "mrr_at_100", "value": 66.625}, {"type": "mrr_at_1000", "value": 66.636}, {"type": "mrr_at_3", "value": 63.632999999999996}, {"type": "mrr_at_5", "value": 65.293}, {"type": "ndcg_at_1", "value": 56.10000000000001}, {"type": "ndcg_at_10", "value": 71.146}, {"type": "ndcg_at_100", "value": 73.27799999999999}, {"type": "ndcg_at_1000", "value": 73.529}, {"type": "ndcg_at_3", "value": 66.09}, {"type": "ndcg_at_5", "value": 69.08999999999999}, {"type": "precision_at_1", "value": 56.10000000000001}, {"type": "precision_at_10", "value": 8.68}, {"type": "precision_at_100", "value": 0.964}, {"type": "precision_at_1000", "value": 0.098}, {"type": "precision_at_3", "value": 24.4}, {"type": "precision_at_5", "value": 16.1}, {"type": "recall_at_1", "value": 56.10000000000001}, {"type": "recall_at_10", "value": 86.8}, {"type": "recall_at_100", "value": 96.39999999999999}, {"type": "recall_at_1000", "value": 98.3}, {"type": "recall_at_3", "value": 73.2}, {"type": "recall_at_5", "value": 80.5}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB IFlyTek", "type": "C-MTEB/IFlyTek-classification", "config": "default", "split": "validation", "revision": "421605374b29664c5fc098418fe20ada9bd55f8a"}, "metrics": [{"type": "accuracy", "value": 54.52096960369373}, {"type": "f1", "value": 40.930845295808695}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB JDReview", "type": "C-MTEB/JDReview-classification", "config": "default", "split": "test", "revision": "b7c64bd89eb87f8ded463478346f76731f07bf8b"}, "metrics": [{"type": "accuracy", "value": 86.51031894934334}, {"type": "ap", "value": 55.9516014323483}, {"type": "f1", "value": 81.54813679326381}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB LCQMC", "type": "C-MTEB/LCQMC", "config": "default", "split": "test", "revision": "17f9b096f80380fce5ed12a9be8be7784b337daf"}, "metrics": [{"type": "cos_sim_pearson", "value": 69.67437838574276}, {"type": "cos_sim_spearman", "value": 73.81314174653045}, {"type": "euclidean_pearson", "value": 72.63430276680275}, {"type": "euclidean_spearman", "value": 73.81358736777001}, {"type": "manhattan_pearson", "value": 72.58743833842829}, {"type": "manhattan_spearman", "value": 73.7590419009179}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB MMarcoReranking", "type": "C-MTEB/Mmarco-reranking", "config": "default", "split": "dev", "revision": "None"}, "metrics": [{"type": "map", "value": 31.648613483640254}, {"type": "mrr", "value": 30.37420634920635}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB MMarcoRetrieval", "type": "C-MTEB/MMarcoRetrieval", "config": "default", "split": "dev", "revision": "539bbde593d947e2a124ba72651aafc09eb33fc2"}, "metrics": [{"type": "map_at_1", "value": 73.28099999999999}, {"type": "map_at_10", "value": 81.977}, {"type": "map_at_100", "value": 82.222}, {"type": "map_at_1000", "value": 82.22699999999999}, {"type": "map_at_3", "value": 80.441}, {"type": "map_at_5", "value": 81.46600000000001}, {"type": "mrr_at_1", "value": 75.673}, {"type": "mrr_at_10", "value": 82.41000000000001}, {"type": "mrr_at_100", "value": 82.616}, {"type": "mrr_at_1000", "value": 82.621}, {"type": "mrr_at_3", "value": 81.094}, {"type": "mrr_at_5", "value": 81.962}, {"type": "ndcg_at_1", "value": 75.673}, {"type": "ndcg_at_10", "value": 85.15599999999999}, {"type": "ndcg_at_100", "value": 86.151}, {"type": "ndcg_at_1000", "value": 86.26899999999999}, {"type": "ndcg_at_3", "value": 82.304}, {"type": "ndcg_at_5", "value": 84.009}, {"type": 
"precision_at_1", "value": 75.673}, {"type": "precision_at_10", "value": 10.042}, {"type": "precision_at_100", "value": 1.052}, {"type": "precision_at_1000", "value": 0.106}, {"type": "precision_at_3", "value": 30.673000000000002}, {"type": "precision_at_5", "value": 19.326999999999998}, {"type": "recall_at_1", "value": 73.28099999999999}, {"type": "recall_at_10", "value": 94.446}, {"type": "recall_at_100", "value": 98.737}, {"type": "recall_at_1000", "value": 99.649}, {"type": "recall_at_3", "value": 86.984}, {"type": "recall_at_5", "value": 91.024}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MassiveIntentClassification (zh-CN)", "type": "mteb/amazon_massive_intent", "config": "zh-CN", "split": "test", "revision": "31efe3c427b0bae9c22cbb560b8f15491cc6bed7"}, "metrics": [{"type": "accuracy", "value": 81.08607935440484}, {"type": "f1", "value": 78.24879986066307}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MassiveScenarioClassification (zh-CN)", "type": "mteb/amazon_massive_scenario", "config": "zh-CN", "split": "test", "revision": "7d571f92784cd94a019292a1f45445077d0ef634"}, "metrics": [{"type": "accuracy", "value": 86.05917955615332}, {"type": "f1", "value": 85.05279279434997}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB MedicalRetrieval", "type": "C-MTEB/MedicalRetrieval", "config": "default", "split": "dev", "revision": "2039188fb5800a9803ba5048df7b76e6fb151fc6"}, "metrics": [{"type": "map_at_1", "value": 56.2}, {"type": "map_at_10", "value": 62.57899999999999}, {"type": "map_at_100", "value": 63.154999999999994}, {"type": "map_at_1000", "value": 63.193}, {"type": "map_at_3", "value": 61.217}, {"type": "map_at_5", "value": 62.012}, {"type": "mrr_at_1", "value": 56.3}, {"type": "mrr_at_10", "value": 62.629000000000005}, {"type": "mrr_at_100", "value": 63.205999999999996}, {"type": "mrr_at_1000", "value": 63.244}, {"type": "mrr_at_3", "value": 61.267}, {"type": "mrr_at_5", "value": 62.062}, {"type": "ndcg_at_1", "value": 56.2}, {"type": "ndcg_at_10", "value": 65.592}, {"type": "ndcg_at_100", "value": 68.657}, {"type": "ndcg_at_1000", "value": 69.671}, {"type": "ndcg_at_3", "value": 62.808}, {"type": "ndcg_at_5", "value": 64.24499999999999}, {"type": "precision_at_1", "value": 56.2}, {"type": "precision_at_10", "value": 7.5}, {"type": "precision_at_100", "value": 0.899}, {"type": "precision_at_1000", "value": 0.098}, {"type": "precision_at_3", "value": 22.467000000000002}, {"type": "precision_at_5", "value": 14.180000000000001}, {"type": "recall_at_1", "value": 56.2}, {"type": "recall_at_10", "value": 75.0}, {"type": "recall_at_100", "value": 89.9}, {"type": "recall_at_1000", "value": 97.89999999999999}, {"type": "recall_at_3", "value": 67.4}, {"type": "recall_at_5", "value": 70.89999999999999}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MultilingualSentiment", "type": "C-MTEB/MultilingualSentiment-classification", "config": "default", "split": "validation", "revision": "46958b007a63fdbf239b7672c25d0bea67b5ea1a"}, "metrics": [{"type": "accuracy", "value": 76.87666666666667}, {"type": "f1", "value": 76.7317686219665}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB Ocnli", "type": "C-MTEB/OCNLI", "config": "default", "split": "validation", "revision": "66e76a618a34d6d565d5538088562851e6daa7ec"}, "metrics": [{"type": "cos_sim_accuracy", "value": 79.64266377910124}, {"type": "cos_sim_ap", "value": 84.78274442344829}, {"type": "cos_sim_f1", "value": 81.16947472745292}, {"type": 
"cos_sim_precision", "value": 76.47058823529412}, {"type": "cos_sim_recall", "value": 86.48363252375924}, {"type": "dot_accuracy", "value": 79.64266377910124}, {"type": "dot_ap", "value": 84.7851404063692}, {"type": "dot_f1", "value": 81.16947472745292}, {"type": "dot_precision", "value": 76.47058823529412}, {"type": "dot_recall", "value": 86.48363252375924}, {"type": "euclidean_accuracy", "value": 79.64266377910124}, {"type": "euclidean_ap", "value": 84.78068373762378}, {"type": "euclidean_f1", "value": 81.14794656110837}, {"type": "euclidean_precision", "value": 76.35009310986965}, {"type": "euclidean_recall", "value": 86.58922914466737}, {"type": "manhattan_accuracy", "value": 79.48023822414727}, {"type": "manhattan_ap", "value": 84.72928897427576}, {"type": "manhattan_f1", "value": 81.32084770823064}, {"type": "manhattan_precision", "value": 76.24768946395564}, {"type": "manhattan_recall", "value": 87.11721224920802}, {"type": "max_accuracy", "value": 79.64266377910124}, {"type": "max_ap", "value": 84.7851404063692}, {"type": "max_f1", "value": 81.32084770823064}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB OnlineShopping", "type": "C-MTEB/OnlineShopping-classification", "config": "default", "split": "test", "revision": "e610f2ebd179a8fda30ae534c3878750a96db120"}, "metrics": [{"type": "accuracy", "value": 94.3}, {"type": "ap", "value": 92.8664032274438}, {"type": "f1", "value": 94.29311102997727}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB PAWSX", "type": "C-MTEB/PAWSX", "config": "default", "split": "test", "revision": "9c6a90e430ac22b5779fb019a23e820b11a8b5e1"}, "metrics": [{"type": "cos_sim_pearson", "value": 48.51392279882909}, {"type": "cos_sim_spearman", "value": 54.06338895994974}, {"type": "euclidean_pearson", "value": 52.58480559573412}, {"type": "euclidean_spearman", "value": 54.06417276612201}, {"type": "manhattan_pearson", "value": 52.69525121721343}, {"type": "manhattan_spearman", "value": 54.048147455389675}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB QBQTC", "type": "C-MTEB/QBQTC", "config": "default", "split": "test", "revision": "790b0510dc52b1553e8c49f3d2afb48c0e5c48b7"}, "metrics": [{"type": "cos_sim_pearson", "value": 29.728387290757325}, {"type": "cos_sim_spearman", "value": 31.366121633635284}, {"type": "euclidean_pearson", "value": 29.14588368552961}, {"type": "euclidean_spearman", "value": 31.36764411112844}, {"type": "manhattan_pearson", "value": 29.63517350523121}, {"type": "manhattan_spearman", "value": 31.94157020583762}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS22 (zh)", "type": "mteb/sts22-crosslingual-sts", "config": "zh", "split": "test", "revision": "eea2b4fe26a775864c896887d910b76a8098ad3f"}, "metrics": [{"type": "cos_sim_pearson", "value": 63.64868296271406}, {"type": "cos_sim_spearman", "value": 66.12800618164744}, {"type": "euclidean_pearson", "value": 63.21405767340238}, {"type": "euclidean_spearman", "value": 66.12786567790748}, {"type": "manhattan_pearson", "value": 64.04300276525848}, {"type": "manhattan_spearman", "value": 66.5066857145652}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STSB", "type": "C-MTEB/STSB", "config": "default", "split": "test", "revision": "0cde68302b3541bb8b3c340dc0644b0b745b3dc0"}, "metrics": [{"type": "cos_sim_pearson", "value": 81.2302623912794}, {"type": "cos_sim_spearman", "value": 81.16833673266562}, {"type": "euclidean_pearson", "value": 79.47647843876024}, {"type": "euclidean_spearman", "value": 81.16944349524972}, {"type": "manhattan_pearson", 
"value": 79.84947238492208}, {"type": "manhattan_spearman", "value": 81.64626599410026}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB T2Reranking", "type": "C-MTEB/T2Reranking", "config": "default", "split": "dev", "revision": "76631901a18387f85eaa53e5450019b87ad58ef9"}, "metrics": [{"type": "map", "value": 67.80129586475687}, {"type": "mrr", "value": 77.77402311635554}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB T2Retrieval", "type": "C-MTEB/T2Retrieval", "config": "default", "split": "dev", "revision": "8731a845f1bf500a4f111cf1070785c793d10e64"}, "metrics": [{"type": "map_at_1", "value": 28.666999999999998}, {"type": "map_at_10", "value": 81.063}, {"type": "map_at_100", "value": 84.504}, {"type": "map_at_1000", "value": 84.552}, {"type": "map_at_3", "value": 56.897}, {"type": "map_at_5", "value": 70.073}, {"type": "mrr_at_1", "value": 92.087}, {"type": "mrr_at_10", "value": 94.132}, {"type": "mrr_at_100", "value": 94.19800000000001}, {"type": "mrr_at_1000", "value": 94.19999999999999}, {"type": "mrr_at_3", "value": 93.78999999999999}, {"type": "mrr_at_5", "value": 94.002}, {"type": "ndcg_at_1", "value": 92.087}, {"type": "ndcg_at_10", "value": 87.734}, {"type": "ndcg_at_100", "value": 90.736}, {"type": "ndcg_at_1000", "value": 91.184}, {"type": "ndcg_at_3", "value": 88.78}, {"type": "ndcg_at_5", "value": 87.676}, {"type": "precision_at_1", "value": 92.087}, {"type": "precision_at_10", "value": 43.46}, {"type": "precision_at_100", "value": 5.07}, {"type": "precision_at_1000", "value": 0.518}, {"type": "precision_at_3", "value": 77.49000000000001}, {"type": "precision_at_5", "value": 65.194}, {"type": "recall_at_1", "value": 28.666999999999998}, {"type": "recall_at_10", "value": 86.632}, {"type": "recall_at_100", "value": 96.646}, {"type": "recall_at_1000", "value": 98.917}, {"type": "recall_at_3", "value": 58.333999999999996}, {"type": "recall_at_5", "value": 72.974}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB TNews", "type": "C-MTEB/TNews-classification", "config": "default", "split": "validation", "revision": "317f262bf1e6126357bbe89e875451e4b0938fe4"}, "metrics": [{"type": "accuracy", "value": 52.971999999999994}, {"type": "f1", "value": 50.2898280984929}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB ThuNewsClusteringP2P", "type": "C-MTEB/ThuNewsClusteringP2P", "config": "default", "split": "test", "revision": "5798586b105c0434e4f0fe5e767abe619442cf93"}, "metrics": [{"type": "v_measure", "value": 86.0797948663824}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB ThuNewsClusteringS2S", "type": "C-MTEB/ThuNewsClusteringS2S", "config": "default", "split": "test", "revision": "8a8b2caeda43f39e13c4bc5bea0f8a667896e10d"}, "metrics": [{"type": "v_measure", "value": 85.10759092255017}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB VideoRetrieval", "type": "C-MTEB/VideoRetrieval", "config": "default", "split": "dev", "revision": "58c2597a5943a2ba48f4668c3b90d796283c5639"}, "metrics": [{"type": "map_at_1", "value": 65.60000000000001}, {"type": "map_at_10", "value": 74.773}, {"type": "map_at_100", "value": 75.128}, {"type": "map_at_1000", "value": 75.136}, {"type": "map_at_3", "value": 73.05}, {"type": "map_at_5", "value": 74.13499999999999}, {"type": "mrr_at_1", "value": 65.60000000000001}, {"type": "mrr_at_10", "value": 74.773}, {"type": "mrr_at_100", "value": 75.128}, {"type": "mrr_at_1000", "value": 75.136}, {"type": "mrr_at_3", "value": 73.05}, {"type": "mrr_at_5", "value": 74.13499999999999}, 
{"type": "ndcg_at_1", "value": 65.60000000000001}, {"type": "ndcg_at_10", "value": 78.84299999999999}, {"type": "ndcg_at_100", "value": 80.40899999999999}, {"type": "ndcg_at_1000", "value": 80.57}, {"type": "ndcg_at_3", "value": 75.40599999999999}, {"type": "ndcg_at_5", "value": 77.351}, {"type": "precision_at_1", "value": 65.60000000000001}, {"type": "precision_at_10", "value": 9.139999999999999}, {"type": "precision_at_100", "value": 0.984}, {"type": "precision_at_1000", "value": 0.1}, {"type": "precision_at_3", "value": 27.400000000000002}, {"type": "precision_at_5", "value": 17.380000000000003}, {"type": "recall_at_1", "value": 65.60000000000001}, {"type": "recall_at_10", "value": 91.4}, {"type": "recall_at_100", "value": 98.4}, {"type": "recall_at_1000", "value": 99.6}, {"type": "recall_at_3", "value": 82.19999999999999}, {"type": "recall_at_5", "value": 86.9}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB Waimai", "type": "C-MTEB/waimai-classification", "config": "default", "split": "test", "revision": "339287def212450dcaa9df8c22bf93e9980c7023"}, "metrics": [{"type": "accuracy", "value": 89.47}, {"type": "ap", "value": 75.59561751845389}, {"type": "f1", "value": 87.95207751382563}]}]}]} |
LiteLLMs/Llama3-OpenBioLLM-8B-GGUF | LiteLLMs | null | [
"gguf",
"llama-3",
"llama",
"Mixtral",
"instruct",
"finetune",
"chatml",
"DPO",
"RLHF",
"gpt4",
"distillation",
"GGUF",
"en",
"arxiv:2305.18290",
"arxiv:2303.13375",
"arxiv:2212.13138",
"arxiv:2305.09617",
"arxiv:2402.07023",
"base_model:meta-llama/Meta-Llama-3-8B",
"base_model:quantized:meta-llama/Meta-Llama-3-8B",
"license:llama3",
"endpoints_compatible",
"region:us"
]
| 2024-04-29T16:51:27 | 2024-04-29T17:55:24 | 25 | 0 | ---
base_model: meta-llama/Meta-Llama-3-8B
language:
- en
license: llama3
tags:
- llama-3
- llama
- Mixtral
- instruct
- finetune
- chatml
- DPO
- RLHF
- gpt4
- distillation
- GGUF
widget:
- example_title: OpenBioLLM-8B
messages:
- role: system
content: You are an expert and experienced from the healthcare and biomedical
domain with extensive medical knowledge and practical experience.
- role: user
content: How long does it take for newborn jaundice to go away?
output:
text: 'Newborn jaundice, also known as neonatal jaundice, is a common condition
in newborns where the yellowing of the skin and eyes occurs due to an elevated
level of bilirubin in the blood. Bilirubin is a yellow pigment that forms when
red blood cells break down. In most cases, newborn jaundice resolves on its
own without any specific treatment.
The duration of newborn jaundice can vary depending on several factors such
as the underlying cause, gestational age at birth, and individual variations
in bilirubin metabolism. Here are some general guidelines:
1. Physiological jaundice: This is the most common type of newborn jaundice
and usually appears within 24-72 hours after birth. It tends to peak between
the second and fifth day of life and gradually improves over the next week or
two. By the time the baby is one week old, the jaundice should have mostly resolved.
2. Breast milk jaundice: This type of jaundice occurs in breastfed babies and
may appear later than physiological jaundice, typically between the fifth and
fourteenth day of life. It tends to persist for a longer duration but usually
resolves within six weeks after birth. 3. Pathological jaundice: This type of
jaundice is less common and occurs due to an underlying medical condition that
affects bilirubin metabolism or liver function. The duration of pathological
jaundice depends on the specific cause and may require treatment.
It''s important for parents to monitor their newborn''s jaundice closely and
seek medical advice if the jaundice progresses rapidly, becomes severe, or is
accompanied by other symptoms such as poor feeding, lethargy, or excessive sleepiness.
In these cases, further evaluation and management may be necessary. Remember
that each baby is unique, and the timing of jaundice resolution can vary. If
you have concerns about your newborn''s jaundice, it''s always best to consult
with a healthcare professional for personalized advice and guidance.'
quantized_by: andrijdavid
model-index:
- name: OpenBioLLM-8B
results: []
---
# Llama3-OpenBioLLM-8B-GGUF
- Original model: [Llama3-OpenBioLLM-8B](https://huggingface.co/aaditya/Llama3-OpenBioLLM-8B)
<!-- description start -->
## Description
This repo contains GGUF format model files for [Llama3-OpenBioLLM-8B](https://huggingface.co/aaditya/Llama3-OpenBioLLM-8B).
<!-- description end -->
<!-- README_GGUF.md-about-gguf start -->
### About GGUF
GGUF is a new format introduced by the llama.cpp team on August 21st 2023. It is a replacement for GGML, which is no longer supported by llama.cpp.
Here is an incomplete list of clients and libraries that are known to support GGUF:
* [llama.cpp](https://github.com/ggerganov/llama.cpp). This is the source project for GGUF, providing both a Command Line Interface (CLI) and a server option.
* [text-generation-webui](https://github.com/oobabooga/text-generation-webui). Known as the most widely used web UI, this project boasts numerous features and powerful extensions, and supports GPU acceleration.
* [Ollama](https://github.com/jmorganca/ollama). A lightweight and extensible framework designed for building and running language models locally. It features a simple API for creating, managing, and executing models, along with a library of pre-built models for use in various applications.
* [KoboldCpp](https://github.com/LostRuins/koboldcpp). A comprehensive web UI offering GPU acceleration across all platforms and architectures, particularly renowned for storytelling.
* [GPT4All](https://gpt4all.io). A free and open source GUI that runs locally, supporting Windows, Linux, and macOS with full GPU acceleration.
* [LM Studio](https://lmstudio.ai/). An intuitive and powerful local GUI for Windows and macOS (Silicon), featuring GPU acceleration.
* [LoLLMS Web UI](https://github.com/ParisNeo/lollms-webui). A notable web UI with a variety of unique features, including a comprehensive model library for easy model selection.
* [Faraday.dev](https://faraday.dev/). An attractive, user-friendly character-based chat GUI for Windows and macOS (both Silicon and Intel), also offering GPU acceleration.
* [llama-cpp-python](https://github.com/abetlen/llama-cpp-python). A Python library equipped with GPU acceleration, LangChain support, and an OpenAI-compatible API server.
* [candle](https://github.com/huggingface/candle). A Rust-based ML framework focusing on performance, including GPU support, and designed for ease of use.
* [ctransformers](https://github.com/marella/ctransformers). A Python library featuring GPU acceleration, LangChain support, and an OpenAI-compatible API server.
* [localGPT](https://github.com/PromtEngineer/localGPT). An open-source initiative enabling private conversations with documents.
<!-- README_GGUF.md-about-gguf end -->
<!-- compatibility_gguf start -->
## Explanation of quantisation methods
<details>
<summary>Click to see details</summary>
The new methods available are:
* GGML_TYPE_Q2_K - "type-1" 2-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Block scales and mins are quantized with 4 bits. This ends up effectively using 2.5625 bits per weight (bpw).
* GGML_TYPE_Q3_K - "type-0" 3-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Scales are quantized with 6 bits. This ends up using 3.4375 bpw.
* GGML_TYPE_Q4_K - "type-1" 4-bit quantization in super-blocks containing 8 blocks, each block having 32 weights. Scales and mins are quantized with 6 bits. This ends up using 4.5 bpw.
* GGML_TYPE_Q5_K - "type-1" 5-bit quantization. Same super-block structure as GGML_TYPE_Q4_K, resulting in 5.5 bpw.
* GGML_TYPE_Q6_K - "type-0" 6-bit quantization. Super-blocks with 16 blocks, each block having 16 weights. Scales are quantized with 8 bits. This ends up using 6.5625 bpw.
</details>
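To get a rough sense of what these bits-per-weight figures mean for download size, the sketch below estimates quantised model sizes as parameter count × bpw. The 8B parameter count is an assumption for this model family, and real GGUF files run slightly larger because of embeddings, metadata, and mixed per-layer quant types:

```python
# Back-of-the-envelope GGUF size estimate: parameters * bits-per-weight / 8 bytes.
bpw = {"Q2_K": 2.5625, "Q3_K": 3.4375, "Q4_K": 4.5, "Q5_K": 5.5, "Q6_K": 6.5625}
n_params = 8e9  # assumed 8 billion parameters for this model family

for name, bits in bpw.items():
    size_gb = n_params * bits / 8 / 1e9
    print(f"{name}: ~{size_gb:.1f} GB")
```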
<!-- compatibility_gguf end -->
<!-- README_GGUF.md-how-to-download start -->
## How to download GGUF files
**Note for manual downloaders:** You almost never want to clone the entire repo! Multiple different quantisation formats are provided, and most users only want to pick and download a single folder.
The following clients/libraries will automatically download models for you, providing a list of available models to choose from:
* LM Studio
* LoLLMS Web UI
* Faraday.dev
### In `text-generation-webui`
Under Download Model, you can enter the model repo: LiteLLMs/Llama3-OpenBioLLM-8B-GGUF and below it, a specific filename to download, such as: Q4_0/Q4_0-00001-of-00009.gguf.
Then click Download.
### On the command line, including multiple files at once
I recommend using the `huggingface-hub` Python library:
```shell
pip3 install huggingface-hub
```
Then you can download any individual model file to the current directory, at high speed, with a command like this:
```shell
huggingface-cli download LiteLLMs/Llama3-OpenBioLLM-8B-GGUF Q4_0/Q4_0-00001-of-00009.gguf --local-dir . --local-dir-use-symlinks False
```
<details>
<summary>More advanced huggingface-cli download usage (click to read)</summary>
You can also download multiple files at once with a pattern:
```shell
huggingface-cli download LiteLLMs/Llama3-OpenBioLLM-8B-GGUF --local-dir . --local-dir-use-symlinks False --include='*Q4_K*gguf'
```
For more documentation on downloading with `huggingface-cli`, please see: [HF -> Hub Python Library -> Download files -> Download from the CLI](https://huggingface.co/docs/huggingface_hub/guides/download#download-from-the-cli).
To accelerate downloads on fast connections (1Gbit/s or higher), install `hf_transfer`:
```shell
pip3 install huggingface_hub[hf_transfer]
```
And set environment variable `HF_HUB_ENABLE_HF_TRANSFER` to `1`:
```shell
HF_HUB_ENABLE_HF_TRANSFER=1 huggingface-cli download LiteLLMs/Llama3-OpenBioLLM-8B-GGUF Q4_0/Q4_0-00001-of-00009.gguf --local-dir . --local-dir-use-symlinks False
```
Windows Command Line users: You can set the environment variable by running `set HF_HUB_ENABLE_HF_TRANSFER=1` before the download command.
</details>
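If you prefer to script the download from Python rather than the CLI, the same shard can be fetched with the `huggingface_hub` library directly; a minimal sketch using the repo and filename from above:

```python
from huggingface_hub import hf_hub_download

# Downloads a single quantised shard into the current directory and returns the local path.
local_path = hf_hub_download(
    repo_id="LiteLLMs/Llama3-OpenBioLLM-8B-GGUF",
    filename="Q4_0/Q4_0-00001-of-00009.gguf",
    local_dir=".",
)
print(local_path)
```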
<!-- README_GGUF.md-how-to-download end -->
<!-- README_GGUF.md-how-to-run start -->
## Example `llama.cpp` command
Make sure you are using `llama.cpp` from commit [d0cee0d](https://github.com/ggerganov/llama.cpp/commit/d0cee0d36d5be95a0d9088b674dbb27354107221) or later.
```shell
./main -ngl 35 -m Q4_0/Q4_0-00001-of-00009.gguf --color -c 8192 --temp 0.7 --repeat_penalty 1.1 -n -1 -p "<PROMPT>"
```
Change `-ngl 35` to the number of layers to offload to GPU. Remove it if you don't have GPU acceleration.
Change `-c 8192` to the desired sequence length. For extended sequence models - eg 8K, 16K, 32K - the necessary RoPE scaling parameters are read from the GGUF file and set by llama.cpp automatically. Note that longer sequence lengths require much more resources, so you may need to reduce this value.
If you want to have a chat-style conversation, replace the `-p <PROMPT>` argument with `-i -ins`.
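For instance, the full interactive command, keeping the same flags and shard path as the example above, would be:

```shell
./main -ngl 35 -m Q4_0/Q4_0-00001-of-00009.gguf --color -c 8192 --temp 0.7 --repeat_penalty 1.1 -i -ins
```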
For other parameters and how to use them, please refer to [the llama.cpp documentation](https://github.com/ggerganov/llama.cpp/blob/master/examples/main/README.md)
## How to run in `text-generation-webui`
Further instructions can be found in the text-generation-webui documentation, here: [text-generation-webui/docs/04 ‐ Model Tab.md](https://github.com/oobabooga/text-generation-webui/blob/main/docs/04%20%E2%80%90%20Model%20Tab.md#llamacpp).
## How to run from Python code
You can use GGUF models from Python using the [llama-cpp-python](https://github.com/abetlen/llama-cpp-python) or [ctransformers](https://github.com/marella/ctransformers) libraries. Note that at the time of writing (Nov 27th 2023), ctransformers has not been updated for some time and is not compatible with some recent models. Therefore I recommend you use llama-cpp-python.
### How to load this model in Python code, using llama-cpp-python
For full documentation, please see: [llama-cpp-python docs](https://abetlen.github.io/llama-cpp-python/).
#### First install the package
Run one of the following commands, according to your system:
```shell
# Base llama-cpp-python with no GPU acceleration
pip install llama-cpp-python
# With NVidia CUDA acceleration
CMAKE_ARGS="-DLLAMA_CUBLAS=on" pip install llama-cpp-python
# Or with OpenBLAS acceleration
CMAKE_ARGS="-DLLAMA_BLAS=ON -DLLAMA_BLAS_VENDOR=OpenBLAS" pip install llama-cpp-python
# Or with CLBLast acceleration
CMAKE_ARGS="-DLLAMA_CLBLAST=on" pip install llama-cpp-python
# Or with AMD ROCm GPU acceleration (Linux only)
CMAKE_ARGS="-DLLAMA_HIPBLAS=on" pip install llama-cpp-python
# Or with Metal GPU acceleration for macOS systems only
CMAKE_ARGS="-DLLAMA_METAL=on" pip install llama-cpp-python
# On Windows, to set the CMAKE_ARGS variable in PowerShell, follow this format; e.g. for NVidia CUDA:
$env:CMAKE_ARGS = "-DLLAMA_CUBLAS=on"
pip install llama-cpp-python
```
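After installation, a quick sanity check confirms the package imports and reports its version (this verifies the build, not GPU offload):

```shell
python3 -c "import llama_cpp; print(llama_cpp.__version__)"
```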
#### Simple llama-cpp-python example code
```python
from llama_cpp import Llama
# Set gpu_layers to the number of layers to offload to GPU. Set to 0 if no GPU acceleration is available on your system.
llm = Llama(
model_path="./Q4_0/Q4_0-00001-of-00009.gguf", # Download the model file first
n_ctx=8192, # The max sequence length to use - Llama 3 natively supports an 8K context; longer sequence lengths require much more resources
n_threads=8, # The number of CPU threads to use, tailor to your system and the resulting performance
n_gpu_layers=35 # The number of layers to offload to GPU, if you have GPU acceleration available
)
# Simple inference example
output = llm(
"<PROMPT>", # Prompt
max_tokens=512, # Generate up to 512 tokens
stop=["</s>"], # Example stop token - not necessarily correct for this specific model! Please check before using.
echo=True # Whether to echo the prompt
)
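
# The call returns an OpenAI-style completion dict; the generated text
# lives under choices[0]["text"].
print(output["choices"][0]["text"])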
# Chat Completion API
llm = Llama(model_path="./Q4_0/Q4_0-00001-of-00009.gguf", chat_format="llama-3") # Set chat_format according to the model you are using; this model is Llama 3-based
response = llm.create_chat_completion(
messages = [
{"role": "system", "content": "You are a story writing assistant."},
{
"role": "user",
"content": "Write a story about llamas."
}
]
)
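print(response["choices"][0]["message"]["content"])  # assistant reply in the OpenAI chat schema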
```
## How to use with LangChain
Here are guides on using llama-cpp-python and ctransformers with LangChain:
* [LangChain + llama-cpp-python](https://python.langchain.com/docs/integrations/llms/llamacpp)
* [LangChain + ctransformers](https://python.langchain.com/docs/integrations/providers/ctransformers)
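As a minimal llama-cpp-python + LangChain sketch (assuming the `langchain-community` package is installed and the quantised file from this repo has been downloaded; the generation settings are illustrative):

```python
from langchain_community.llms import LlamaCpp

# Illustrative settings - tune n_gpu_layers and n_ctx for your hardware.
llm = LlamaCpp(
    model_path="./Q4_0/Q4_0-00001-of-00009.gguf",
    n_ctx=8192,
    n_gpu_layers=35,
    temperature=0.7,
)

print(llm.invoke("List three common applications of biomedical language models."))
```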
<!-- README_GGUF.md-how-to-run end -->
<!-- footer end -->
<!-- original-model-card start -->
# Original model card: Llama3-OpenBioLLM-8B
<div align="center">
<img width="260px" src="https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/BrQCb95lmEIFz79QAmoNA.png"></div>

<div align="center">
<h1>Advancing Open-source Large Language Models in Medical Domain</h1>
</div>
<p align="center" style="margin-top: 0px;">
<a href="https://colab.research.google.com/drive/1F5oV20InEYeAJGmBwYF9NM_QhLmjBkKJ?usp=sharing">
<img src="https://colab.research.google.com/assets/colab-badge.svg" alt="OpenChat Logo" style="width:20px; vertical-align: middle; display: inline-block; margin-right: 5px; margin-left: 10px; margin-top: 0px; margin-bottom: 0px;"/>
<span class="link-text" style=" margin-right: 5px;">Online Demo</span>
</a> |
<a href="https://github.com/openlifescience-ai">
<img src="https://github.githubassets.com/assets/GitHub-Mark-ea2971cee799.png" alt="GitHub Logo" style="width:20px; vertical-align: middle; display: inline-block; margin-right: 5px; margin-left: 5px; margin-top: 0px; margin-bottom: 0px;"/>
<span class="link-text" style=" margin-right: 5px;">GitHub</span>
</a> |
<a href="#">
<img src="https://github.com/alpayariyak/openchat/blob/master/assets/arxiv-logomark-small-square-border.png?raw=true" alt="ArXiv Logo" style="width:20px; vertical-align: middle; display: inline-block; margin-right: 5px; margin-left: 5px; margin-top: 0px; margin-bottom: 0px;"/>
<span class="link-text" style="margin-right: 5px;">Paper</span>
</a> |
<a href="https://discord.gg/A5Fjf5zC69">
<img src="https://cloud.githubusercontent.com/assets/6291467/26705903/96c2d66e-477c-11e7-9f4e-f3c0efe96c9a.png" alt="Discord Logo" style="width:20px; vertical-align: middle; display: inline-block; margin-right: 5px; margin-left: 5px; margin-top: 0px; margin-bottom: 0px;"/>
<span class="link-text">Discord</span>
</a>
</p>

Introducing OpenBioLLM-8B: A State-of-the-Art Open Source Biomedical Large Language Model
OpenBioLLM-8B is an advanced open source language model designed specifically for the biomedical domain. Developed by Saama AI Labs, this model leverages cutting-edge techniques to achieve state-of-the-art performance on a wide range of biomedical tasks.
🏥 **Biomedical Specialization**: OpenBioLLM-8B is tailored for the unique language and knowledge requirements of the medical and life sciences fields. It was fine-tuned on a vast corpus of high-quality biomedical data, enabling it to understand and generate text with domain-specific accuracy and fluency.
🎓 **Superior Performance**: With 8 billion parameters, OpenBioLLM-8B outperforms other open source biomedical language models of similar scale. It has also demonstrated better results compared to larger proprietary & open-source models like GPT-3.5 and Meditron-70B on biomedical benchmarks.
🧠 **Advanced Training Techniques**: OpenBioLLM-8B builds upon the powerful foundation of the [Meta-Llama-3-8B](https://huggingface.co/meta-llama/Meta-Llama-3-8B) model. It incorporates the DPO dataset and fine-tuning recipe along with a custom diverse medical instruction dataset. Key components of the training pipeline include:
<div align="center">
<img width="1200px" src="https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/oPchsJsEpQoGcGXVbh7YS.png">
</div>
- **Policy Optimization**: [Direct Preference Optimization: Your Language Model is Secretly a Reward Model (DPO)](https://arxiv.org/abs/2305.18290)
- **Ranking Dataset**: [berkeley-nest/Nectar](https://huggingface.co/datasets/berkeley-nest/Nectar)
- **Fine-tuning dataset**: Custom Medical Instruct dataset (We plan to release a sample training dataset in our upcoming paper; please stay updated)
This combination of cutting-edge techniques enables OpenBioLLM-8B to align with key capabilities and preferences for biomedical applications.
⚙️ **Release Details**:
- **Model Size**: 8 billion parameters
- **Quantization**: Optimized quantized versions available [Here](https://huggingface.co/aaditya/OpenBioLLM-Llama3-8B-GGUF)
- **Language(s) (NLP):** en
- **Developed By**: [Ankit Pal (Aaditya Ura)](https://aadityaura.github.io/) from Saama AI Labs
- **License:** Meta-Llama License
- **Fine-tuned from models:** [meta-llama/Meta-Llama-3-8B](https://huggingface.co/meta-llama/Meta-Llama-3-8B)
- **Resources for more information:**
- Paper: Coming soon
The model can be fine-tuned for more specialized tasks and datasets as needed.
OpenBioLLM-8B represents an important step forward in democratizing advanced language AI for the biomedical community. By leveraging state-of-the-art architectures and training techniques from leading open source efforts like Llama-3, we have created a powerful tool to accelerate innovation and discovery in healthcare and the life sciences.
We are excited to share OpenBioLLM-8B with researchers and developers around the world.
### Use with transformers
**Important: Please use the exact chat template provided by the Llama-3 instruct version; otherwise performance will degrade. The model output can be verbose in rare cases; consider setting temperature = 0 to reduce this.**
See the snippet below for usage with Transformers:
```python
import transformers
import torch
model_id = "aaditya/OpenBioLLM-Llama3-8B"
pipeline = transformers.pipeline(
"text-generation",
model=model_id,
model_kwargs={"torch_dtype": torch.bfloat16},
device="auto",
)
messages = [
{"role": "system", "content": "You are an expert and experienced from the healthcare and biomedical domain with extensive medical knowledge and practical experience. Your name is OpenBioLLM, and you were developed by Saama AI Labs. who's willing to help answer the user's query with explanation. In your explanation, leverage your deep medical expertise such as relevant anatomical structures, physiological processes, diagnostic criteria, treatment guidelines, or other pertinent medical concepts. Use precise medical terminology while still aiming to make the explanation clear and accessible to a general audience."},
{"role": "user", "content": "How can i split a 3mg or 4mg waefin pill so i can get a 2.5mg pill?"},
]
prompt = pipeline.tokenizer.apply_chat_template(
messages,
tokenize=False,
add_generation_prompt=True
)
terminators = [
pipeline.tokenizer.eos_token_id,
pipeline.tokenizer.convert_tokens_to_ids("<|eot_id|>")
]
outputs = pipeline(
prompt,
max_new_tokens=256,
eos_token_id=terminators,
    do_sample=False,  # greedy decoding, i.e. temperature = 0 as recommended above; set do_sample=True with temperature > 0 to sample
)
print(outputs[0]["generated_text"][len(prompt):])
```
## **Training procedure**
### **Training hyperparameters**
<details>
<summary>Click to see details</summary>
- learning_rate: 0.0002
- lr_scheduler: cosine
- train_batch_size: 12
- eval_batch_size: 8
- GPU: H100 80GB SXM5
- num_devices: 1
- optimizer: adamw_bnb_8bit
- lr_scheduler_warmup_steps: 100
- num_epochs: 4
</details>
### **Peft hyperparameters**
<details>
<summary>Click to see details</summary>
- adapter: qlora
- lora_r: 128
- lora_alpha: 256
- lora_dropout: 0.05
- lora_target_linear: true
- lora_target_modules:
- q_proj
- v_proj
- k_proj
- o_proj
- gate_proj
- down_proj
- up_proj
</details>
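For readers reproducing this setup, the adapter settings above map roughly onto the following `peft` configuration (a sketch derived from the listed hyperparameters, not the authors' actual training script):

```python
from peft import LoraConfig

# Mirrors the QLoRA adapter hyperparameters listed above.
lora_config = LoraConfig(
    r=128,
    lora_alpha=256,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj", "k_proj", "o_proj",
                    "gate_proj", "down_proj", "up_proj"],
    task_type="CAUSAL_LM",
)
```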
### **Training results**
### **Framework versions**
- Transformers 4.39.3
- Pytorch 2.1.2+cu121
- Datasets 2.18.0
- Tokenizers 0.15.1
- Axolotl
- LM Evaluation Harness (for evaluation)
# Benchmark Results
🔥 OpenBioLLM-8B demonstrates superior performance compared to larger models, such as GPT-3.5 and Meditron-70B, across 9 diverse biomedical datasets, achieving state-of-the-art results with an average score of 72.50% despite having a significantly smaller parameter count. The model's strong performance in domain-specific tasks, such as Clinical KG, Medical Genetics, and PubMedQA, highlights its ability to effectively capture and apply biomedical knowledge.
🚨 The GPT-4, Med-PaLM-1, and Med-PaLM-2 results are taken from their official papers. Since Med-PaLM doesn't provide zero-shot accuracy, we are using 5-shot accuracy from their paper for comparison. All results presented are in the zero-shot setting, except for Med-PaLM-2 and Med-PaLM-1, which use 5-shot accuracy.
| | Clinical KG | Medical Genetics | Anatomy | Pro Medicine | College Biology | College Medicine | MedQA 4 opts | PubMedQA | MedMCQA | Avg |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| **OpenBioLLM-70B** | **92.93** | **93.197** | **83.904** | 93.75 | 93.827 | **85.749** | 78.162 | 78.97 | **74.014** | **86.05588** |
| Med-PaLM-2 (5-shot) | 88.3 | 90 | 77.8 | **95.2** | 94.4 | 80.9 | **79.7** | **79.2** | 71.3 | 84.08 |
| **GPT-4** | 86.04 | 91 | 80 | 93.01 | **95.14** | 76.88 | 78.87 | 75.2 | 69.52 | 82.85 |
| Med-PaLM-1 (Flan-PaLM, 5-shot) | 80.4 | 75 | 63.7 | 83.8 | 88.9 | 76.3 | 67.6 | 79 | 57.6 | 74.7 |
| **OpenBioLLM-8B** | 76.101 | 86.1 | 69.829 | 78.21 | 84.213 | 68.042 | 58.993 | 74.12 | 56.913 | 72.502 |
| Gemini-1.0 | 76.7 | 75.8 | 66.7 | 77.7 | 88 | 69.2 | 58 | 70.7 | 54.3 | 70.79 |
| GPT-3.5 Turbo 1106 | 74.71 | 74 | 72.79 | 72.79 | 72.91 | 64.73 | 57.71 | 72.66 | 53.79 | 66 |
| Meditron-70B | 66.79 | 69 | 53.33 | 71.69 | 76.38 | 63 | 57.1 | 76.6 | 46.85 | 64.52 |
| gemma-7b | 69.81 | 70 | 59.26 | 66.18 | 79.86 | 60.12 | 47.21 | 76.2 | 48.96 | 64.18 |
| Mistral-7B-v0.1 | 68.68 | 71 | 55.56 | 68.38 | 68.06 | 59.54 | 50.82 | 75.4 | 48.2 | 62.85 |
| Apollo-7B | 62.26 | 72 | 61.48 | 69.12 | 70.83 | 55.49 | 55.22 | 39.8 | 53.77 | 60 |
| MedAlpaca-7b | 57.36 | 69 | 57.04 | 67.28 | 65.28 | 54.34 | 41.71 | 72.8 | 37.51 | 58.03 |
| BioMistral-7B | 59.9 | 64 | 56.5 | 60.4 | 59 | 54.7 | 50.6 | 77.5 | 48.1 | 57.3 |
| AlpaCare-llama2-7b | 49.81 | 49 | 45.92 | 33.82 | 50 | 43.35 | 29.77 | 72.2 | 34.42 | 45.36 |
| ClinicalGPT | 30.56 | 27 | 30.37 | 19.48 | 25 | 24.27 | 26.08 | 63.8 | 28.18 | 30.52 |
<div align="center">
<img width="1600px" src="https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/_SzdcJSBjZyo8RS1bTEkP.png">
</div>
## Detailed Medical Subjectwise accuracy

# Use Cases & Examples
🚨 **Below results are from the quantized version of OpenBioLLM-70B**
# Summarize Clinical Notes
OpenBioLLM-70B can efficiently analyze and summarize complex clinical notes, EHR data, and discharge summaries, extracting key information and generating concise, structured summaries.

# Answer Medical Questions
OpenBioLLM-70B can provide answers to a wide range of medical questions.


<details>
<summary>Click to see details</summary>



</details>
# Clinical Entity Recognition
OpenBioLLM-70B can perform advanced clinical entity recognition by identifying and extracting key medical concepts, such as diseases, symptoms, medications, procedures, and anatomical structures, from unstructured clinical text. By leveraging its deep understanding of medical terminology and context, the model can accurately annotate and categorize clinical entities, enabling more efficient information retrieval, data analysis, and knowledge discovery from electronic health records, research articles, and other biomedical text sources. This capability can support various downstream applications, such as clinical decision support, pharmacovigilance, and medical research.



# Biomarkers Extraction

# Classification
OpenBioLLM-70B can perform various biomedical classification tasks, such as disease prediction, sentiment analysis, and medical document categorization.

# De-Identification
OpenBioLLM-70B can detect and remove personally identifiable information (PII) from medical records, ensuring patient privacy and compliance with data protection regulations like HIPAA.

**Advisory Notice!**
While OpenBioLLM-70B & 8B leverage high-quality data sources, their outputs may still contain inaccuracies, biases, or misalignments that could pose risks if relied upon for medical decision-making without further testing and refinement. The models' performance has not yet been rigorously evaluated in randomized controlled trials or real-world healthcare environments.
Therefore, we strongly advise against using OpenBioLLM-70B & 8B for any direct patient care, clinical decision support, or other professional medical purposes at this time. Their use should be limited to research, development, and exploratory applications by qualified individuals who understand their limitations.
OpenBioLLM-70B & 8B are intended solely as research tools to assist healthcare professionals and should never be considered a replacement for the professional judgment and expertise of a qualified medical doctor.
Appropriately adapting and validating OpenBioLLM-70B & 8B for specific medical use cases would require significant additional work, potentially including:
- Thorough testing and evaluation in relevant clinical scenarios
- Alignment with evidence-based guidelines and best practices
- Mitigation of potential biases and failure modes
- Integration with human oversight and interpretation
- Compliance with regulatory and ethical standards
Always consult a qualified healthcare provider for personal medical needs.
# Citation
If you find OpenBioLLM-70B & 8B useful in your work, please cite the model as follows:
```
@misc{OpenBioLLMs,
  author = {Ankit Pal and Malaikannan Sankarasubbu},
title = {OpenBioLLMs: Advancing Open-Source Large Language Models for Healthcare and Life Sciences},
year = {2024},
publisher = {Hugging Face},
journal = {Hugging Face repository},
howpublished = {\url{https://huggingface.co/aaditya/OpenBioLLM-Llama3-70B}}
}
```
The accompanying paper is currently in progress and will be released soon.
<div align="center">
<h2> 💌 Contact </h2>
</div>
We look forward to hearing from you and collaborating on this exciting project!
**Contributors:**
- [Ankit Pal (Aaditya Ura)](https://aadityaura.github.io/) [aadityaura at gmail dot com]
- Saama AI Labs
- Note: I am looking for a funded PhD opportunity, especially if it fits my Responsible Generative AI, Multimodal LLMs, Geometric Deep Learning, and Healthcare AI skillset.
# References
We thank the [Meta Team](https://huggingface.co/meta-llama/Meta-Llama-3-70B-Instruct) for their amazing models!
Result sources:
- [1] GPT-4 [Capabilities of GPT-4 on Medical Challenge Problems](https://arxiv.org/abs/2303.13375)
- [2] Med-PaLM-1 [Large Language Models Encode Clinical Knowledge](https://arxiv.org/abs/2212.13138)
- [3] Med-PaLM-2 [Towards Expert-Level Medical Question Answering with Large Language Models](https://arxiv.org/abs/2305.09617)
- [4] Gemini-1.0 [Gemini Goes to Med School](https://arxiv.org/abs/2402.07023)
<!-- original-model-card end -->
| [
"QUESTION_ANSWERING"
]
| [
"MEDQA",
"PUBMEDQA"
]
| BioNLP | # Llama3-OpenBioLLM-8B-GGUF
- Original model: [Llama3-OpenBioLLM-8B](https://huggingface.co/aaditya/Llama3-OpenBioLLM-8B)
<!-- description start -->
## Description
This repo contains GGUF format model files for [Llama3-OpenBioLLM-8B](https://huggingface.co/aaditya/Llama3-OpenBioLLM-8B).
<!-- description end -->
<!-- README_GGUF.md-about-gguf start -->
### About GGUF
GGUF is a new format introduced by the llama.cpp team on August 21st 2023. It is a replacement for GGML, which is no longer supported by llama.cpp.
Here is an incomplete list of clients and libraries that are known to support GGUF:
* [llama.cpp](https://github.com/ggerganov/llama.cpp): The source project for GGUF, providing both a Command Line Interface (CLI) and a server option.
* [text-generation-webui](https://github.com/oobabooga/text-generation-webui): Known as the most widely used web UI, this project boasts numerous features and powerful extensions, and supports GPU acceleration.
* [Ollama](https://github.com/jmorganca/ollama): A lightweight and extensible framework designed for building and running language models locally. It features a simple API for creating, managing, and executing models, along with a library of pre-built models for use in various applications.
* [KoboldCpp](https://github.com/LostRuins/koboldcpp): A comprehensive web UI offering GPU acceleration across all platforms and architectures, particularly renowned for storytelling.
* [GPT4All](https://gpt4all.io): A free and open source GUI that runs locally, supporting Windows, Linux, and macOS with full GPU acceleration.
* [LM Studio](https://lmstudio.ai/): An intuitive and powerful local GUI for Windows and macOS (Silicon), featuring GPU acceleration.
* [LoLLMS Web UI](https://github.com/ParisNeo/lollms-webui): A notable web UI with a variety of unique features, including a comprehensive model library for easy model selection.
* [Faraday.dev](https://faraday.dev/): An attractive, user-friendly character-based chat GUI for Windows and macOS (both Silicon and Intel), also offering GPU acceleration.
* [llama-cpp-python](https://github.com/abetlen/llama-cpp-python): A Python library equipped with GPU acceleration, LangChain support, and an OpenAI-compatible API server.
* [candle](https://github.com/huggingface/candle): A Rust-based ML framework focusing on performance, including GPU support, and designed for ease of use.
* [ctransformers](https://github.com/marella/ctransformers): A Python library featuring GPU acceleration, LangChain support, and an OpenAI-compatible AI server.
* [localGPT](https://github.com/PromtEngineer/localGPT): An open-source initiative enabling private conversations with documents.
<!-- README_GGUF.md-about-gguf end -->
<!-- compatibility_gguf start -->
## Explanation of quantisation methods
<details>
<summary>Click to see details</summary>
The new methods available are:
* GGML_TYPE_Q2_K - "type-1" 2-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Block scales and mins are quantized with 4 bits. This ends up effectively using 2.5625 bits per weight (bpw).
* GGML_TYPE_Q3_K - "type-0" 3-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Scales are quantized with 6 bits. This ends up using 3.4375 bpw.
* GGML_TYPE_Q4_K - "type-1" 4-bit quantization in super-blocks containing 8 blocks, each block having 32 weights. Scales and mins are quantized with 6 bits. This ends up using 4.5 bpw.
* GGML_TYPE_Q5_K - "type-1" 5-bit quantization. Same super-block structure as GGML_TYPE_Q4_K, resulting in 5.5 bpw.
* GGML_TYPE_Q6_K - "type-0" 6-bit quantization. Super-blocks with 16 blocks, each block having 16 weights. Scales are quantized with 8 bits. This ends up using 6.5625 bpw.
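As a rough, illustrative size estimate (back-of-the-envelope arithmetic, not an official figure): at 4.5 bpw, an 8-billion-parameter model occupies about 8 × 10^9 × 4.5 / 8 ≈ 4.5 × 10^9 bytes, i.e. roughly 4.5 GB on disk. Runtime memory use is higher, since it also depends on context length via the KV cache.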
</details>
<!-- compatibility_gguf end -->
<!-- README_GGUF.md-how-to-download start -->
## How to download GGUF files
**Note for manual downloaders:** You almost never want to clone the entire repo! Multiple different quantisation formats are provided, and most users only want to pick and download a single folder.
The following clients/libraries will automatically download models for you, providing a list of available models to choose from:
* LM Studio
* LoLLMS Web UI
* Faraday.dev
### In `text-generation-webui`
Under Download Model, you can enter the model repo: LiteLLMs/Llama3-OpenBioLLM-8B-GGUF and below it, a specific filename to download, such as: Q4_0/Q4_0-00001-of-00009.gguf.
Then click Download.
### On the command line, including multiple files at once
I recommend using the `huggingface-hub` Python library:
```shell
pip3 install huggingface-hub
```
Then you can download any individual model file to the current directory, at high speed, with a command like this:
```shell
huggingface-cli download LiteLLMs/Llama3-OpenBioLLM-8B-GGUF Q4_0/Q4_0-00001-of-00009.gguf --local-dir . --local-dir-use-symlinks False
```
<details>
<summary>More advanced huggingface-cli download usage (click to read)</summary>
You can also download multiple files at once with a pattern:
```shell
huggingface-cli download LiteLLMs/Llama3-OpenBioLLM-8B-GGUF --local-dir . --local-dir-use-symlinks False --include='*Q4_K*gguf'
```
For more documentation on downloading with `huggingface-cli`, please see: [HF -> Hub Python Library -> Download files -> Download from the CLI](https://huggingface.co/docs/huggingface_hub/guides/download#download-from-the-cli).
To accelerate downloads on fast connections (1Gbit/s or higher), install `hf_transfer`:
```shell
pip3 install 'huggingface_hub[hf_transfer]'
```
And set environment variable `HF_HUB_ENABLE_HF_TRANSFER` to `1`:
```shell
HF_HUB_ENABLE_HF_TRANSFER=1 huggingface-cli download LiteLLMs/Llama3-OpenBioLLM-8B-GGUF Q4_0/Q4_0-00001-of-00009.gguf --local-dir . --local-dir-use-symlinks False
```
Windows Command Line users: You can set the environment variable by running `set HF_HUB_ENABLE_HF_TRANSFER=1` before the download command.
</details>
<!-- README_GGUF.md-how-to-download end -->
<!-- README_GGUF.md-how-to-run start -->
## Example `llama.cpp` command
Make sure you are using `llama.cpp` from commit [d0cee0d](https://github.com/ggerganov/llama.cpp/commit/d0cee0d36d5be95a0d9088b674dbb27354107221) or later.
```shell
./main -ngl 35 -m Q4_0/Q4_0-00001-of-00009.gguf --color -c 8192 --temp 0.7 --repeat_penalty 1.1 -n -1 -p "<PROMPT>"
```
Change `-ngl 35` to the number of layers to offload to GPU. Remove it if you don't have GPU acceleration.
| {"base_model": "meta-llama/Meta-Llama-3-8B", "language": ["en"], "license": "llama3", "tags": ["llama-3", "llama", "Mixtral", "instruct", "finetune", "chatml", "DPO", "RLHF", "gpt4", "distillation", "GGUF"], "widget": [{"example_title": "OpenBioLLM-8B", "messages": [{"role": "system", "content": "You are an expert and experienced from the healthcare and biomedical domain with extensive medical knowledge and practical experience."}, {"role": "user", "content": "How long does it take for newborn jaundice to go away?"}], "output": {"text": "Newborn jaundice, also known as neonatal jaundice, is a common condition in newborns where the yellowing of the skin and eyes occurs due to an elevated level of bilirubin in the blood. Bilirubin is a yellow pigment that forms when red blood cells break down. In most cases, newborn jaundice resolves on its own without any specific treatment.\nThe duration of newborn jaundice can vary depending on several factors such as the underlying cause, gestational age at birth, and individual variations in bilirubin metabolism. Here are some general guidelines:\n1. Physiological jaundice: This is the most common type of newborn jaundice and usually appears within 24-72 hours after birth. It tends to peak between the second and fifth day of life and gradually improves over the next week or two. By the time the baby is one week old, the jaundice should have mostly resolved. 2. Breast milk jaundice: This type of jaundice occurs in breastfed babies and may appear later than physiological jaundice, typically between the fifth and fourteenth day of life. It tends to persist for a longer duration but usually resolves within six weeks after birth. 3. Pathological jaundice: This type of jaundice is less common and occurs due to an underlying medical condition that affects bilirubin metabolism or liver function. The duration of pathological jaundice depends on the specific cause and may require treatment.\nIt's important for parents to monitor their newborn's jaundice closely and seek medical advice if the jaundice progresses rapidly, becomes severe, or is accompanied by other symptoms such as poor feeding, lethargy, or excessive sleepiness. In these cases, further evaluation and management may be necessary. Remember that each baby is unique, and the timing of jaundice resolution can vary. If you have concerns about your newborn's jaundice, it's always best to consult with a healthcare professional for personalized advice and guidance."}}], "quantized_by": "andrijdavid", "model-index": [{"name": "OpenBioLLM-8B", "results": []}]} |
twadada/nmc-cls-100_correct | twadada | null | [
"mteb",
"model-index",
"region:us"
]
| 2024-09-13T07:45:45 | 2024-09-13T07:45:57 | 0 | 0 | ---
tags:
- mteb
model-index:
- name: nomic_classification_100
results:
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (en)
type: None
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 70.35820895522387
- type: ap
value: 32.749463629599404
- type: f1
value: 64.24277142151362
- task:
type: Classification
dataset:
name: MTEB AmazonPolarityClassification
type: None
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 64.705075
- type: ap
value: 59.80751870729784
- type: f1
value: 64.44356439771583
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (en)
type: None
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 33.642
- type: f1
value: 33.115627459191316
- task:
type: Retrieval
dataset:
name: MTEB ArguAna
type: None
config: default
split: test
revision: c22ab2a51041ffd869aaddef7af8d8215647e41a
metrics:
- type: map_at_1
value: 17.852
- type: map_at_10
value: 29.279
- type: map_at_100
value: 30.55
- type: map_at_1000
value: 30.605
- type: map_at_3
value: 25.296000000000003
- type: map_at_5
value: 27.498
- type: mrr_at_1
value: 18.137
- type: mrr_at_10
value: 29.398999999999997
- type: mrr_at_100
value: 30.677
- type: mrr_at_1000
value: 30.731
- type: mrr_at_3
value: 25.427
- type: mrr_at_5
value: 27.614
- type: ndcg_at_1
value: 17.852
- type: ndcg_at_10
value: 36.071999999999996
- type: ndcg_at_100
value: 42.403
- type: ndcg_at_1000
value: 43.733
- type: ndcg_at_3
value: 27.799000000000003
- type: ndcg_at_5
value: 31.805
- type: precision_at_1
value: 17.852
- type: precision_at_10
value: 5.797
- type: precision_at_100
value: 0.878
- type: precision_at_1000
value: 0.098
- type: precision_at_3
value: 11.688
- type: precision_at_5
value: 8.976
- type: recall_at_1
value: 17.852
- type: recall_at_10
value: 57.965999999999994
- type: recall_at_100
value: 87.83800000000001
- type: recall_at_1000
value: 98.08
- type: recall_at_3
value: 35.064
- type: recall_at_5
value: 44.879000000000005
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringP2P
type: None
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 29.25407935159316
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringS2S
type: None
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 19.74540490543985
- task:
type: Reranking
dataset:
name: MTEB AskUbuntuDupQuestions
type: None
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 50.92680362916445
- type: mrr
value: 63.515697137580794
- task:
type: STS
dataset:
name: MTEB BIOSSES
type: None
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_pearson
value: 72.8794628935656
- type: cos_sim_spearman
value: 72.28899655141599
- type: euclidean_pearson
value: 72.84840274301827
- type: euclidean_spearman
value: 72.28899655141599
- type: manhattan_pearson
value: 72.27814398382203
- type: manhattan_spearman
value: 71.66970533201172
- task:
type: Classification
dataset:
name: MTEB Banking77Classification
type: None
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 66.20129870129871
- type: f1
value: 65.02435616242589
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringP2P
type: None
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 28.56746746078776
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringS2S
type: None
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 19.212994376812908
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackAndroidRetrieval
type: None
config: default
split: test
revision: f46a197baaae43b4f621051089b82a364682dfeb
metrics:
- type: map_at_1
value: 17.7
- type: map_at_10
value: 23.182
- type: map_at_100
value: 24.2
- type: map_at_1000
value: 24.354
- type: map_at_3
value: 21.448
- type: map_at_5
value: 22.394
- type: mrr_at_1
value: 21.459
- type: mrr_at_10
value: 27.538
- type: mrr_at_100
value: 28.399
- type: mrr_at_1000
value: 28.479
- type: mrr_at_3
value: 25.775
- type: mrr_at_5
value: 26.705000000000002
- type: ndcg_at_1
value: 21.459
- type: ndcg_at_10
value: 26.987
- type: ndcg_at_100
value: 31.935999999999996
- type: ndcg_at_1000
value: 35.335
- type: ndcg_at_3
value: 24.214
- type: ndcg_at_5
value: 25.344
- type: precision_at_1
value: 21.459
- type: precision_at_10
value: 5.007000000000001
- type: precision_at_100
value: 0.9299999999999999
- type: precision_at_1000
value: 0.149
- type: precision_at_3
value: 11.445
- type: precision_at_5
value: 8.155
- type: recall_at_1
value: 17.7
- type: recall_at_10
value: 33.698
- type: recall_at_100
value: 55.933
- type: recall_at_1000
value: 79.567
- type: recall_at_3
value: 25.331
- type: recall_at_5
value: 28.681
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackEnglishRetrieval
type: None
config: default
split: test
revision: ad9991cb51e31e31e430383c75ffb2885547b5f0
metrics:
- type: map_at_1
value: 13.008000000000001
- type: map_at_10
value: 17.331
- type: map_at_100
value: 18.128
- type: map_at_1000
value: 18.253
- type: map_at_3
value: 15.708
- type: map_at_5
value: 16.601
- type: mrr_at_1
value: 16.624
- type: mrr_at_10
value: 21.038999999999998
- type: mrr_at_100
value: 21.782
- type: mrr_at_1000
value: 21.869
- type: mrr_at_3
value: 19.320999999999998
- type: mrr_at_5
value: 20.266000000000002
- type: ndcg_at_1
value: 16.624
- type: ndcg_at_10
value: 20.584
- type: ndcg_at_100
value: 24.43
- type: ndcg_at_1000
value: 27.486
- type: ndcg_at_3
value: 17.724999999999998
- type: ndcg_at_5
value: 18.990000000000002
- type: precision_at_1
value: 16.624
- type: precision_at_10
value: 3.8850000000000002
- type: precision_at_100
value: 0.7250000000000001
- type: precision_at_1000
value: 0.122
- type: precision_at_3
value: 8.514
- type: precision_at_5
value: 6.204
- type: recall_at_1
value: 13.008000000000001
- type: recall_at_10
value: 26.799
- type: recall_at_100
value: 43.802
- type: recall_at_1000
value: 65.035
- type: recall_at_3
value: 18.411
- type: recall_at_5
value: 21.887999999999998
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGamingRetrieval
type: None
config: default
split: test
revision: 4885aa143210c98657558c04aaf3dc47cfb54340
metrics:
- type: map_at_1
value: 18.459
- type: map_at_10
value: 24.775
- type: map_at_100
value: 25.691999999999997
- type: map_at_1000
value: 25.802999999999997
- type: map_at_3
value: 22.784
- type: map_at_5
value: 23.764
- type: mrr_at_1
value: 21.379
- type: mrr_at_10
value: 27.555000000000003
- type: mrr_at_100
value: 28.355000000000004
- type: mrr_at_1000
value: 28.438999999999997
- type: mrr_at_3
value: 25.663999999999998
- type: mrr_at_5
value: 26.598
- type: ndcg_at_1
value: 21.379
- type: ndcg_at_10
value: 28.691
- type: ndcg_at_100
value: 33.387
- type: ndcg_at_1000
value: 36.299
- type: ndcg_at_3
value: 24.883
- type: ndcg_at_5
value: 26.438
- type: precision_at_1
value: 21.379
- type: precision_at_10
value: 4.777
- type: precision_at_100
value: 0.7799999999999999
- type: precision_at_1000
value: 0.11199999999999999
- type: precision_at_3
value: 11.16
- type: precision_at_5
value: 7.7490000000000006
- type: recall_at_1
value: 18.459
- type: recall_at_10
value: 37.964999999999996
- type: recall_at_100
value: 59.728
- type: recall_at_1000
value: 81.351
- type: recall_at_3
value: 27.538
- type: recall_at_5
value: 31.464
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGisRetrieval
type: None
config: default
split: test
revision: 5003b3064772da1887988e05400cf3806fe491f2
metrics:
- type: map_at_1
value: 8.324
- type: map_at_10
value: 10.779
- type: map_at_100
value: 11.371
- type: map_at_1000
value: 11.466999999999999
- type: map_at_3
value: 9.922
- type: map_at_5
value: 10.319
- type: mrr_at_1
value: 9.153
- type: mrr_at_10
value: 11.700000000000001
- type: mrr_at_100
value: 12.314
- type: mrr_at_1000
value: 12.406
- type: mrr_at_3
value: 10.81
- type: mrr_at_5
value: 11.234
- type: ndcg_at_1
value: 9.153
- type: ndcg_at_10
value: 12.472
- type: ndcg_at_100
value: 15.942
- type: ndcg_at_1000
value: 19.118
- type: ndcg_at_3
value: 10.644
- type: ndcg_at_5
value: 11.355
- type: precision_at_1
value: 9.153
- type: precision_at_10
value: 1.921
- type: precision_at_100
value: 0.391
- type: precision_at_1000
value: 0.07100000000000001
- type: precision_at_3
value: 4.444
- type: precision_at_5
value: 3.073
- type: recall_at_1
value: 8.324
- type: recall_at_10
value: 16.971
- type: recall_at_100
value: 34.041
- type: recall_at_1000
value: 59.45399999999999
- type: recall_at_3
value: 11.77
- type: recall_at_5
value: 13.522
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackMathematicaRetrieval
type: None
config: default
split: test
revision: 90fceea13679c63fe563ded68f3b6f06e50061de
metrics:
- type: map_at_1
value: 3.998
- type: map_at_10
value: 6.22
- type: map_at_100
value: 6.687
- type: map_at_1000
value: 6.796
- type: map_at_3
value: 5.124
- type: map_at_5
value: 5.705
- type: mrr_at_1
value: 5.224
- type: mrr_at_10
value: 7.915
- type: mrr_at_100
value: 8.433
- type: mrr_at_1000
value: 8.530999999999999
- type: mrr_at_3
value: 6.654
- type: mrr_at_5
value: 7.276000000000001
- type: ndcg_at_1
value: 5.224
- type: ndcg_at_10
value: 8.238
- type: ndcg_at_100
value: 11.126999999999999
- type: ndcg_at_1000
value: 14.552999999999999
- type: ndcg_at_3
value: 6.0249999999999995
- type: ndcg_at_5
value: 6.981999999999999
- type: precision_at_1
value: 5.224
- type: precision_at_10
value: 1.7160000000000002
- type: precision_at_100
value: 0.371
- type: precision_at_1000
value: 0.078
- type: precision_at_3
value: 2.9850000000000003
- type: precision_at_5
value: 2.413
- type: recall_at_1
value: 3.998
- type: recall_at_10
value: 12.995999999999999
- type: recall_at_100
value: 26.819
- type: recall_at_1000
value: 52.608
- type: recall_at_3
value: 6.721000000000001
- type: recall_at_5
value: 9.198
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackPhysicsRetrieval
type: None
config: default
split: test
revision: 79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4
metrics:
- type: map_at_1
value: 12.331
- type: map_at_10
value: 16.913
- type: map_at_100
value: 17.841
- type: map_at_1000
value: 17.977
- type: map_at_3
value: 15.633
- type: map_at_5
value: 16.256
- type: mrr_at_1
value: 15.110999999999999
- type: mrr_at_10
value: 20.419999999999998
- type: mrr_at_100
value: 21.294
- type: mrr_at_1000
value: 21.386
- type: mrr_at_3
value: 18.961
- type: mrr_at_5
value: 19.682
- type: ndcg_at_1
value: 15.110999999999999
- type: ndcg_at_10
value: 20.115
- type: ndcg_at_100
value: 24.914
- type: ndcg_at_1000
value: 28.375
- type: ndcg_at_3
value: 17.732
- type: ndcg_at_5
value: 18.658
- type: precision_at_1
value: 15.110999999999999
- type: precision_at_10
value: 3.696
- type: precision_at_100
value: 0.762
- type: precision_at_1000
value: 0.125
- type: precision_at_3
value: 8.566
- type: precision_at_5
value: 5.9670000000000005
- type: recall_at_1
value: 12.331
- type: recall_at_10
value: 26.429000000000002
- type: recall_at_100
value: 47.341
- type: recall_at_1000
value: 72.149
- type: recall_at_3
value: 19.467000000000002
- type: recall_at_5
value: 21.981
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackProgrammersRetrieval
type: None
config: default
split: test
revision: 6184bc1440d2dbc7612be22b50686b8826d22b32
metrics:
- type: map_at_1
value: 8.262
- type: map_at_10
value: 11.962
- type: map_at_100
value: 12.729
- type: map_at_1000
value: 12.86
- type: map_at_3
value: 10.65
- type: map_at_5
value: 11.388
- type: mrr_at_1
value: 10.502
- type: mrr_at_10
value: 14.715
- type: mrr_at_100
value: 15.484
- type: mrr_at_1000
value: 15.581999999999999
- type: mrr_at_3
value: 13.299
- type: mrr_at_5
value: 14.097999999999999
- type: ndcg_at_1
value: 10.502
- type: ndcg_at_10
value: 14.649000000000001
- type: ndcg_at_100
value: 18.738
- type: ndcg_at_1000
value: 22.456
- type: ndcg_at_3
value: 12.222
- type: ndcg_at_5
value: 13.314
- type: precision_at_1
value: 10.502
- type: precision_at_10
value: 2.82
- type: precision_at_100
value: 0.588
- type: precision_at_1000
value: 0.108
- type: precision_at_3
value: 5.936
- type: precision_at_5
value: 4.452
- type: recall_at_1
value: 8.262
- type: recall_at_10
value: 20.168
- type: recall_at_100
value: 38.405
- type: recall_at_1000
value: 65.694
- type: recall_at_3
value: 13.428999999999998
- type: recall_at_5
value: 16.229
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackRetrieval
type: mteb/cqadupstack
config: default
split: test
revision: 4885aa143210c98657558c04aaf3dc47cfb54340
metrics:
- type: map_at_1
value: 10.117416666666665
- type: map_at_10
value: 13.858333333333334
- type: map_at_100
value: 14.565166666666668
- type: map_at_1000
value: 14.68266666666667
- type: map_at_3
value: 12.60983333333333
- type: map_at_5
value: 13.277416666666667
- type: mrr_at_1
value: 12.332833333333335
- type: mrr_at_10
value: 16.376333333333335
- type: mrr_at_100
value: 17.063333333333333
- type: mrr_at_1000
value: 17.1535
- type: mrr_at_3
value: 15.040666666666667
- type: mrr_at_5
value: 15.764833333333334
- type: ndcg_at_1
value: 12.332833333333335
- type: ndcg_at_10
value: 16.51366666666667
- type: ndcg_at_100
value: 20.2845
- type: ndcg_at_1000
value: 23.54025
- type: ndcg_at_3
value: 14.171250000000002
- type: ndcg_at_5
value: 15.193583333333333
- type: precision_at_1
value: 12.332833333333335
- type: precision_at_10
value: 2.983083333333333
- type: precision_at_100
value: 0.58325
- type: precision_at_1000
value: 0.10250000000000001
- type: precision_at_3
value: 6.626083333333334
- type: precision_at_5
value: 4.774916666666665
- type: recall_at_1
value: 10.117416666666665
- type: recall_at_10
value: 22.14666666666667
- type: recall_at_100
value: 39.5745
- type: recall_at_1000
value: 63.73550000000001
- type: recall_at_3
value: 15.431666666666665
- type: recall_at_5
value: 18.1215
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackStatsRetrieval
type: None
config: default
split: test
revision: 65ac3a16b8e91f9cee4c9828cc7c335575432a2a
metrics:
- type: map_at_1
value: 7.431
- type: map_at_10
value: 10.172
- type: map_at_100
value: 10.639999999999999
- type: map_at_1000
value: 10.716000000000001
- type: map_at_3
value: 9.242
- type: map_at_5
value: 9.614
- type: mrr_at_1
value: 9.202
- type: mrr_at_10
value: 12.08
- type: mrr_at_100
value: 12.58
- type: mrr_at_1000
value: 12.649
- type: mrr_at_3
value: 11.145
- type: mrr_at_5
value: 11.59
- type: ndcg_at_1
value: 9.202
- type: ndcg_at_10
value: 12.291
- type: ndcg_at_100
value: 14.940999999999999
- type: ndcg_at_1000
value: 17.325
- type: ndcg_at_3
value: 10.446
- type: ndcg_at_5
value: 11.027000000000001
- type: precision_at_1
value: 9.202
- type: precision_at_10
value: 2.193
- type: precision_at_100
value: 0.388
- type: precision_at_1000
value: 0.065
- type: precision_at_3
value: 4.806
- type: precision_at_5
value: 3.374
- type: recall_at_1
value: 7.431
- type: recall_at_10
value: 17.197000000000003
- type: recall_at_100
value: 29.704000000000004
- type: recall_at_1000
value: 48.278999999999996
- type: recall_at_3
value: 11.616999999999999
- type: recall_at_5
value: 13.181000000000001
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackTexRetrieval
type: None
config: default
split: test
revision: 46989137a86843e03a6195de44b09deda022eec7
metrics:
- type: map_at_1
value: 5.348
- type: map_at_10
value: 7.591
- type: map_at_100
value: 8.109
- type: map_at_1000
value: 8.206
- type: map_at_3
value: 6.782000000000001
- type: map_at_5
value: 7.244000000000001
- type: mrr_at_1
value: 6.641
- type: mrr_at_10
value: 9.281
- type: mrr_at_100
value: 9.838
- type: mrr_at_1000
value: 9.922
- type: mrr_at_3
value: 8.286999999999999
- type: mrr_at_5
value: 8.866999999999999
- type: ndcg_at_1
value: 6.641
- type: ndcg_at_10
value: 9.302000000000001
- type: ndcg_at_100
value: 12.200999999999999
- type: ndcg_at_1000
value: 15.223999999999998
- type: ndcg_at_3
value: 7.692
- type: ndcg_at_5
value: 8.474
- type: precision_at_1
value: 6.641
- type: precision_at_10
value: 1.755
- type: precision_at_100
value: 0.388
- type: precision_at_1000
value: 0.079
- type: precision_at_3
value: 3.6249999999999996
- type: precision_at_5
value: 2.753
- type: recall_at_1
value: 5.348
- type: recall_at_10
value: 12.887
- type: recall_at_100
value: 26.391
- type: recall_at_1000
value: 49.156
- type: recall_at_3
value: 8.519
- type: recall_at_5
value: 10.431
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackUnixRetrieval
type: None
config: default
split: test
revision: 6c6430d3a6d36f8d2a829195bc5dc94d7e063e53
metrics:
- type: map_at_1
value: 7.9750000000000005
- type: map_at_10
value: 11.28
- type: map_at_100
value: 11.953
- type: map_at_1000
value: 12.051
- type: map_at_3
value: 10.022
- type: map_at_5
value: 10.807
- type: mrr_at_1
value: 9.795
- type: mrr_at_10
value: 13.544999999999998
- type: mrr_at_100
value: 14.249999999999998
- type: mrr_at_1000
value: 14.341000000000001
- type: mrr_at_3
value: 12.174
- type: mrr_at_5
value: 13.041
- type: ndcg_at_1
value: 9.795
- type: ndcg_at_10
value: 13.697000000000001
- type: ndcg_at_100
value: 17.389
- type: ndcg_at_1000
value: 20.46
- type: ndcg_at_3
value: 11.277
- type: ndcg_at_5
value: 12.579
- type: precision_at_1
value: 9.795
- type: precision_at_10
value: 2.435
- type: precision_at_100
value: 0.481
- type: precision_at_1000
value: 0.084
- type: precision_at_3
value: 5.255
- type: precision_at_5
value: 3.955
- type: recall_at_1
value: 7.9750000000000005
- type: recall_at_10
value: 18.981
- type: recall_at_100
value: 36.178
- type: recall_at_1000
value: 59.46900000000001
- type: recall_at_3
value: 12.371
- type: recall_at_5
value: 15.613
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackWebmastersRetrieval
type: None
config: default
split: test
revision: 160c094312a0e1facb97e55eeddb698c0abe3571
metrics:
- type: map_at_1
value: 10.742
- type: map_at_10
value: 15.346000000000002
- type: map_at_100
value: 16.153000000000002
- type: map_at_1000
value: 16.311999999999998
- type: map_at_3
value: 14.222999999999999
- type: map_at_5
value: 14.777000000000001
- type: mrr_at_1
value: 14.032
- type: mrr_at_10
value: 18.83
- type: mrr_at_100
value: 19.564999999999998
- type: mrr_at_1000
value: 19.655
- type: mrr_at_3
value: 17.523
- type: mrr_at_5
value: 18.244
- type: ndcg_at_1
value: 14.032
- type: ndcg_at_10
value: 18.496000000000002
- type: ndcg_at_100
value: 22.377
- type: ndcg_at_1000
value: 26.284000000000002
- type: ndcg_at_3
value: 16.520000000000003
- type: ndcg_at_5
value: 17.276
- type: precision_at_1
value: 14.032
- type: precision_at_10
value: 3.5770000000000004
- type: precision_at_100
value: 0.783
- type: precision_at_1000
value: 0.16
- type: precision_at_3
value: 7.971
- type: precision_at_5
value: 5.692
- type: recall_at_1
value: 10.742
- type: recall_at_10
value: 24.157999999999998
- type: recall_at_100
value: 42.091
- type: recall_at_1000
value: 70.054
- type: recall_at_3
value: 17.916999999999998
- type: recall_at_5
value: 20.131
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackWordpressRetrieval
type: None
config: default
split: test
revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4
metrics:
- type: map_at_1
value: 7.831
- type: map_at_10
value: 10.749
- type: map_at_100
value: 11.279
- type: map_at_1000
value: 11.397
- type: map_at_3
value: 9.78
- type: map_at_5
value: 10.459999999999999
- type: mrr_at_1
value: 8.872
- type: mrr_at_10
value: 11.898
- type: mrr_at_100
value: 12.466000000000001
- type: mrr_at_1000
value: 12.583
- type: mrr_at_3
value: 10.875
- type: mrr_at_5
value: 11.577
- type: ndcg_at_1
value: 8.872
- type: ndcg_at_10
value: 12.642000000000001
- type: ndcg_at_100
value: 16.032
- type: ndcg_at_1000
value: 19.567999999999998
- type: ndcg_at_3
value: 10.674999999999999
- type: ndcg_at_5
value: 11.886
- type: precision_at_1
value: 8.872
- type: precision_at_10
value: 2.015
- type: precision_at_100
value: 0.41200000000000003
- type: precision_at_1000
value: 0.077
- type: precision_at_3
value: 4.806
- type: precision_at_5
value: 3.512
- type: recall_at_1
value: 7.831
- type: recall_at_10
value: 17.511
- type: recall_at_100
value: 34.461000000000006
- type: recall_at_1000
value: 62.01
- type: recall_at_3
value: 12.089
- type: recall_at_5
value: 15.139
- task:
type: Retrieval
dataset:
name: MTEB ClimateFEVER
type: None
config: default
split: test
revision: 47f2ac6acb640fc46020b02a5b59fdda04d39380
metrics:
- type: map_at_1
value: 3.3300000000000005
- type: map_at_10
value: 5.8709999999999996
- type: map_at_100
value: 6.7860000000000005
- type: map_at_1000
value: 6.955
- type: map_at_3
value: 4.714
- type: map_at_5
value: 5.26
- type: mrr_at_1
value: 7.101
- type: mrr_at_10
value: 12.125
- type: mrr_at_100
value: 13.200000000000001
- type: mrr_at_1000
value: 13.295000000000002
- type: mrr_at_3
value: 10.119
- type: mrr_at_5
value: 11.038
- type: ndcg_at_1
value: 7.101
- type: ndcg_at_10
value: 9.159
- type: ndcg_at_100
value: 14.030000000000001
- type: ndcg_at_1000
value: 18.013
- type: ndcg_at_3
value: 6.6739999999999995
- type: ndcg_at_5
value: 7.4719999999999995
- type: precision_at_1
value: 7.101
- type: precision_at_10
value: 3.16
- type: precision_at_100
value: 0.84
- type: precision_at_1000
value: 0.156
- type: precision_at_3
value: 5.081
- type: precision_at_5
value: 4.143
- type: recall_at_1
value: 3.3300000000000005
- type: recall_at_10
value: 12.215
- type: recall_at_100
value: 29.683999999999997
- type: recall_at_1000
value: 52.951
- type: recall_at_3
value: 6.356000000000001
- type: recall_at_5
value: 8.315
- task:
type: Retrieval
dataset:
name: MTEB DBPedia
type: None
config: default
split: test
revision: c0f706b76e590d620bd6618b3ca8efdd34e2d659
metrics:
- type: map_at_1
value: 1.718
- type: map_at_10
value: 3.639
- type: map_at_100
value: 4.853
- type: map_at_1000
value: 5.219
- type: map_at_3
value: 2.6149999999999998
- type: map_at_5
value: 3.073
- type: mrr_at_1
value: 20.0
- type: mrr_at_10
value: 26.88
- type: mrr_at_100
value: 27.753
- type: mrr_at_1000
value: 27.822000000000003
- type: mrr_at_3
value: 24.667
- type: mrr_at_5
value: 25.654
- type: ndcg_at_1
value: 15.0
- type: ndcg_at_10
value: 10.878
- type: ndcg_at_100
value: 12.011
- type: ndcg_at_1000
value: 16.492
- type: ndcg_at_3
value: 12.818999999999999
- type: ndcg_at_5
value: 11.554
- type: precision_at_1
value: 20.0
- type: precision_at_10
value: 9.625
- type: precision_at_100
value: 3.037
- type: precision_at_1000
value: 0.7080000000000001
- type: precision_at_3
value: 15.082999999999998
- type: precision_at_5
value: 12.1
- type: recall_at_1
value: 1.718
- type: recall_at_10
value: 5.716
- type: recall_at_100
value: 14.266000000000002
- type: recall_at_1000
value: 30.012
- type: recall_at_3
value: 3.108
- type: recall_at_5
value: 4.181
- task:
type: Classification
dataset:
name: MTEB EmotionClassification
type: None
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 41.114999999999995
- type: f1
value: 37.00141090816854
- task:
type: Retrieval
dataset:
name: MTEB FEVER
type: None
config: default
split: test
revision: bea83ef9e8fb933d90a2f1d5515737465d613e12
metrics:
- type: map_at_1
value: 5.523
- type: map_at_10
value: 8.036
- type: map_at_100
value: 8.581999999999999
- type: map_at_1000
value: 8.657
- type: map_at_3
value: 7.13
- type: map_at_5
value: 7.536
- type: mrr_at_1
value: 5.836
- type: mrr_at_10
value: 8.547
- type: mrr_at_100
value: 9.123000000000001
- type: mrr_at_1000
value: 9.197
- type: mrr_at_3
value: 7.563000000000001
- type: mrr_at_5
value: 8.006
- type: ndcg_at_1
value: 5.836
- type: ndcg_at_10
value: 9.764000000000001
- type: ndcg_at_100
value: 12.866
- type: ndcg_at_1000
value: 15.243
- type: ndcg_at_3
value: 7.7700000000000005
- type: ndcg_at_5
value: 8.518
- type: precision_at_1
value: 5.836
- type: precision_at_10
value: 1.6070000000000002
- type: precision_at_100
value: 0.331
- type: precision_at_1000
value: 0.055
- type: precision_at_3
value: 3.2849999999999997
- type: precision_at_5
value: 2.37
- type: recall_at_1
value: 5.523
- type: recall_at_10
value: 14.795
- type: recall_at_100
value: 29.932
- type: recall_at_1000
value: 48.946
- type: recall_at_3
value: 9.208
- type: recall_at_5
value: 10.984
- task:
type: Retrieval
dataset:
name: MTEB FiQA2018
type: None
config: default
split: test
revision: 27a168819829fe9bcd655c2df245fb19452e8e06
metrics:
- type: map_at_1
value: 4.135
- type: map_at_10
value: 6.433999999999999
- type: map_at_100
value: 7.196
- type: map_at_1000
value: 7.356999999999999
- type: map_at_3
value: 5.339
- type: map_at_5
value: 5.878
- type: mrr_at_1
value: 8.796
- type: mrr_at_10
value: 12.357999999999999
- type: mrr_at_100
value: 13.208
- type: mrr_at_1000
value: 13.318
- type: mrr_at_3
value: 10.777000000000001
- type: mrr_at_5
value: 11.525
- type: ndcg_at_1
value: 8.796
- type: ndcg_at_10
value: 9.332
- type: ndcg_at_100
value: 13.517999999999999
- type: ndcg_at_1000
value: 17.907999999999998
- type: ndcg_at_3
value: 7.481999999999999
- type: ndcg_at_5
value: 8.065
- type: precision_at_1
value: 8.796
- type: precision_at_10
value: 2.8240000000000003
- type: precision_at_100
value: 0.705
- type: precision_at_1000
value: 0.14400000000000002
- type: precision_at_3
value: 4.887
- type: precision_at_5
value: 3.8580000000000005
- type: recall_at_1
value: 4.135
- type: recall_at_10
value: 12.292
- type: recall_at_100
value: 28.915999999999997
- type: recall_at_1000
value: 57.477999999999994
- type: recall_at_3
value: 6.747
- type: recall_at_5
value: 8.667
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA
type: None
config: default
split: test
revision: ab518f4d6fcca38d87c25209f94beba119d02014
metrics:
- type: map_at_1
value: 5.928
- type: map_at_10
value: 8.469
- type: map_at_100
value: 8.936
- type: map_at_1000
value: 9.02
- type: map_at_3
value: 7.582
- type: map_at_5
value: 8.021
- type: mrr_at_1
value: 11.857
- type: mrr_at_10
value: 15.675
- type: mrr_at_100
value: 16.273
- type: mrr_at_1000
value: 16.356
- type: mrr_at_3
value: 14.347999999999999
- type: mrr_at_5
value: 14.995
- type: ndcg_at_1
value: 11.857
- type: ndcg_at_10
value: 11.651
- type: ndcg_at_100
value: 14.374999999999998
- type: ndcg_at_1000
value: 16.912
- type: ndcg_at_3
value: 9.625
- type: ndcg_at_5
value: 10.474
- type: precision_at_1
value: 11.857
- type: precision_at_10
value: 2.777
- type: precision_at_100
value: 0.503
- type: precision_at_1000
value: 0.08499999999999999
- type: precision_at_3
value: 6.140000000000001
- type: precision_at_5
value: 4.362
- type: recall_at_1
value: 5.928
- type: recall_at_10
value: 13.883000000000001
- type: recall_at_100
value: 25.137999999999998
- type: recall_at_1000
value: 42.315999999999995
- type: recall_at_3
value: 9.21
- type: recall_at_5
value: 10.905
- task:
type: Classification
dataset:
name: MTEB ImdbClassification
type: None
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 65.4388
- type: ap
value: 60.440774024423426
- type: f1
value: 65.31315753102281
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO
type: None
config: default
split: dev
revision: c5a29a104738b98a9e76336939199e264163d4a0
metrics:
- type: map_at_1
value: 3.4479999999999995
- type: map_at_10
value: 5.74
- type: map_at_100
value: 6.2780000000000005
- type: map_at_1000
value: 6.358999999999999
- type: map_at_3
value: 4.82
- type: map_at_5
value: 5.3
- type: mrr_at_1
value: 3.5389999999999997
- type: mrr_at_10
value: 5.906000000000001
- type: mrr_at_100
value: 6.455
- type: mrr_at_1000
value: 6.5360000000000005
- type: mrr_at_3
value: 4.9639999999999995
- type: mrr_at_5
value: 5.453
- type: ndcg_at_1
value: 3.5389999999999997
- type: ndcg_at_10
value: 7.255000000000001
- type: ndcg_at_100
value: 10.308
- type: ndcg_at_1000
value: 12.93
- type: ndcg_at_3
value: 5.314
- type: ndcg_at_5
value: 6.184
- type: precision_at_1
value: 3.5389999999999997
- type: precision_at_10
value: 1.246
- type: precision_at_100
value: 0.28500000000000003
- type: precision_at_1000
value: 0.051000000000000004
- type: precision_at_3
value: 2.297
- type: precision_at_5
value: 1.814
- type: recall_at_1
value: 3.4479999999999995
- type: recall_at_10
value: 11.982
- type: recall_at_100
value: 27.123
- type: recall_at_1000
value: 48.489
- type: recall_at_3
value: 6.607
- type: recall_at_5
value: 8.706
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (en)
type: None
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 85.9484724122207
- type: f1
value: 85.39768490584245
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (en)
type: None
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 58.48837209302326
- type: f1
value: 39.10849416181491
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (en)
type: None
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 60.632145258910555
- type: f1
value: 58.09773014884143
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (en)
type: None
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 67.68325487558843
- type: f1
value: 65.91204845805859
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringP2P
type: None
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 26.41069242141184
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringS2S
type: None
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 23.307848920918044
- task:
type: Reranking
dataset:
name: MTEB MindSmallReranking
type: None
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 28.270878365120332
- type: mrr
value: 29.057926505909254
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus
type: None
config: default
split: test
revision: ec0fa4fe99da2ff19ca1214b7966684033a58814
metrics:
- type: map_at_1
value: 1.855
- type: map_at_10
value: 3.582
- type: map_at_100
value: 4.694
- type: map_at_1000
value: 5.739
- type: map_at_3
value: 2.677
- type: map_at_5
value: 3.1
- type: mrr_at_1
value: 18.884999999999998
- type: mrr_at_10
value: 27.256999999999998
- type: mrr_at_100
value: 28.327999999999996
- type: mrr_at_1000
value: 28.402
- type: mrr_at_3
value: 24.2
- type: mrr_at_5
value: 26.011
- type: ndcg_at_1
value: 17.957
- type: ndcg_at_10
value: 14.051
- type: ndcg_at_100
value: 14.282
- type: ndcg_at_1000
value: 24.3
- type: ndcg_at_3
value: 15.478
- type: ndcg_at_5
value: 14.782
- type: precision_at_1
value: 18.884999999999998
- type: precision_at_10
value: 10.743
- type: precision_at_100
value: 4.449
- type: precision_at_1000
value: 1.7670000000000001
- type: precision_at_3
value: 14.654
- type: precision_at_5
value: 12.940999999999999
- type: recall_at_1
value: 1.855
- type: recall_at_10
value: 6.861000000000001
- type: recall_at_100
value: 18.044
- type: recall_at_1000
value: 52.712
- type: recall_at_3
value: 3.3369999999999997
- type: recall_at_5
value: 4.562
- task:
type: Retrieval
dataset:
name: MTEB NQ
type: None
config: default
split: test
revision: b774495ed302d8c44a3a7ea25c90dbce03968f31
metrics:
- type: map_at_1
value: 4.881
- type: map_at_10
value: 8.241999999999999
- type: map_at_100
value: 8.956999999999999
- type: map_at_1000
value: 9.062000000000001
- type: map_at_3
value: 6.981
- type: map_at_5
value: 7.61
- type: mrr_at_1
value: 5.5329999999999995
- type: mrr_at_10
value: 9.184000000000001
- type: mrr_at_100
value: 9.918000000000001
- type: mrr_at_1000
value: 10.018
- type: mrr_at_3
value: 7.836
- type: mrr_at_5
value: 8.518
- type: ndcg_at_1
value: 5.5329999999999995
- type: ndcg_at_10
value: 10.554
- type: ndcg_at_100
value: 14.341999999999999
- type: ndcg_at_1000
value: 17.458000000000002
- type: ndcg_at_3
value: 7.8759999999999994
- type: ndcg_at_5
value: 9.023
- type: precision_at_1
value: 5.5329999999999995
- type: precision_at_10
value: 1.944
- type: precision_at_100
value: 0.411
- type: precision_at_1000
value: 0.07100000000000001
- type: precision_at_3
value: 3.669
- type: precision_at_5
value: 2.8160000000000003
- type: recall_at_1
value: 4.881
- type: recall_at_10
value: 16.898
- type: recall_at_100
value: 34.625
- type: recall_at_1000
value: 58.901
- type: recall_at_3
value: 9.651
- type: recall_at_5
value: 12.35
- task:
type: Retrieval
dataset:
name: MTEB QuoraRetrieval
type: None
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 53.159
- type: map_at_10
value: 64.053
- type: map_at_100
value: 64.938
- type: map_at_1000
value: 64.994
- type: map_at_3
value: 61.413
- type: map_at_5
value: 62.966
- type: mrr_at_1
value: 61.129999999999995
- type: mrr_at_10
value: 68.84400000000001
- type: mrr_at_100
value: 69.3
- type: mrr_at_1000
value: 69.319
- type: mrr_at_3
value: 67.113
- type: mrr_at_5
value: 68.162
- type: ndcg_at_1
value: 61.160000000000004
- type: ndcg_at_10
value: 68.944
- type: ndcg_at_100
value: 72.10499999999999
- type: ndcg_at_1000
value: 73.046
- type: ndcg_at_3
value: 65.223
- type: ndcg_at_5
value: 67.05
- type: precision_at_1
value: 61.160000000000004
- type: precision_at_10
value: 10.392999999999999
- type: precision_at_100
value: 1.327
- type: precision_at_1000
value: 0.149
- type: precision_at_3
value: 28.13
- type: precision_at_5
value: 18.656
- type: recall_at_1
value: 53.159
- type: recall_at_10
value: 78.412
- type: recall_at_100
value: 91.399
- type: recall_at_1000
value: 97.52
- type: recall_at_3
value: 67.794
- type: recall_at_5
value: 72.801
- task:
    type: Retrieval
  dataset:
    name: MTEB SCIDOCS
    type: None
    config: default
    split: test
    revision: None
  metrics:
  - type: map_at_1
    value: 1.8450000000000002
- type: map_at_10
value: 4.172
- type: map_at_100
value: 5.092
- type: map_at_1000
value: 5.3100000000000005
- type: map_at_3
value: 3.093
- type: map_at_5
value: 3.6450000000000005
- type: mrr_at_1
value: 9.1
- type: mrr_at_10
value: 15.15
- type: mrr_at_100
value: 16.216
- type: mrr_at_1000
value: 16.332
- type: mrr_at_3
value: 12.55
- type: mrr_at_5
value: 13.975000000000001
- type: ndcg_at_1
value: 9.1
- type: ndcg_at_10
value: 8.065999999999999
- type: ndcg_at_100
value: 12.982
- type: ndcg_at_1000
value: 18.046
- type: ndcg_at_3
value: 7.295999999999999
- type: ndcg_at_5
value: 6.572
- type: precision_at_1
value: 9.1
- type: precision_at_10
value: 4.29
- type: precision_at_100
value: 1.16
- type: precision_at_1000
value: 0.23900000000000002
- type: precision_at_3
value: 6.833
- type: precision_at_5
value: 5.88
- type: recall_at_1
value: 1.8450000000000002
- type: recall_at_10
value: 8.706999999999999
- type: recall_at_100
value: 23.645
- type: recall_at_1000
value: 48.597
- type: recall_at_3
value: 4.175
- type: recall_at_5
value: 5.973
- task:
    type: Retrieval
  dataset:
    name: MTEB TRECCOVID
    type: None
    config: default
    split: test
    revision: None
  metrics:
  - type: map_at_1
    value: 0.058
- type: map_at_10
value: 0.445
- type: map_at_100
value: 2.489
- type: map_at_1000
value: 6.3100000000000005
- type: map_at_3
value: 0.16999999999999998
- type: map_at_5
value: 0.254
- type: mrr_at_1
value: 32.0
- type: mrr_at_10
value: 46.016
- type: mrr_at_100
value: 46.683
- type: mrr_at_1000
value: 46.719
- type: mrr_at_3
value: 41.667
- type: mrr_at_5
value: 42.967
- type: ndcg_at_1
value: 26.0
- type: ndcg_at_10
value: 29.885
- type: ndcg_at_100
value: 22.958000000000002
- type: ndcg_at_1000
value: 22.244
- type: ndcg_at_3
value: 29.787999999999997
- type: ndcg_at_5
value: 29.494999999999997
- type: precision_at_1
value: 32.0
- type: precision_at_10
value: 33.800000000000004
- type: precision_at_100
value: 24.52
- type: precision_at_1000
value: 11.196
- type: precision_at_3
value: 35.333
- type: precision_at_5
value: 34.0
- type: recall_at_1
value: 0.058
- type: recall_at_10
value: 0.657
- type: recall_at_100
value: 5.069
- type: recall_at_1000
value: 22.447
- type: recall_at_3
value: 0.2
- type: recall_at_5
value: 0.32299999999999995
- task:
type: Clustering
dataset:
name: MTEB RedditClustering
type: None
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 30.140589231842256
- task:
type: Clustering
dataset:
name: MTEB RedditClusteringP2P
type: None
config: default
split: test
revision: 282350215ef01743dc01b456c7f5241fa8937f16
metrics:
- type: v_measure
value: 39.92770613505385
- task:
type: STS
dataset:
name: MTEB SICK-R
type: None
config: default
split: test
revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
metrics:
- type: cos_sim_pearson
value: 75.59024815989618
- type: cos_sim_spearman
value: 68.11624653233133
- type: euclidean_pearson
value: 73.27920094980502
- type: euclidean_spearman
value: 68.11632959681863
- type: manhattan_pearson
value: 72.54935141266294
- type: manhattan_spearman
value: 67.12457070604133
- task:
type: STS
dataset:
name: MTEB STS12
type: None
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_pearson
value: 69.40126270570799
- type: cos_sim_spearman
value: 62.14207404840335
- type: euclidean_pearson
value: 66.27602017682412
- type: euclidean_spearman
value: 62.143384728461314
- type: manhattan_pearson
value: 67.07706053549664
- type: manhattan_spearman
value: 63.06497657163255
- task:
type: STS
dataset:
name: MTEB STS13
type: None
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_pearson
value: 75.5989515866992
- type: cos_sim_spearman
value: 77.15211512453997
- type: euclidean_pearson
value: 76.70296919445704
- type: euclidean_spearman
value: 77.15215294384531
- type: manhattan_pearson
value: 77.00183340244841
- type: manhattan_spearman
value: 77.54347126493187
- task:
type: STS
dataset:
name: MTEB STS14
type: None
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_pearson
value: 73.76592708615566
- type: cos_sim_spearman
value: 70.57102535486983
- type: euclidean_pearson
value: 73.16493844323281
- type: euclidean_spearman
value: 70.57101566858893
- type: manhattan_pearson
value: 73.3644832097739
- type: manhattan_spearman
value: 70.93527541966915
- task:
type: STS
dataset:
name: MTEB STS15
type: None
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_pearson
value: 75.95076880553377
- type: cos_sim_spearman
value: 77.68458699868269
- type: euclidean_pearson
value: 77.7470713475935
- type: euclidean_spearman
value: 77.6845933113232
- type: manhattan_pearson
value: 78.19369618957612
- type: manhattan_spearman
value: 78.11088657087784
- task:
type: STS
dataset:
name: MTEB STS16
type: None
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_pearson
value: 71.9715763028299
- type: cos_sim_spearman
value: 73.53220647955904
- type: euclidean_pearson
value: 73.57406594330985
- type: euclidean_spearman
value: 73.53303581777323
- type: manhattan_pearson
value: 74.03967460920595
- type: manhattan_spearman
value: 74.05778553630698
- task:
type: STS
dataset:
name: MTEB STS17 (en-en)
type: None
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 78.73667148725723
- type: cos_sim_spearman
value: 80.81028828869353
- type: euclidean_pearson
value: 81.15810431179573
- type: euclidean_spearman
value: 80.81116429309112
- type: manhattan_pearson
value: 81.55719120035107
- type: manhattan_spearman
value: 81.20882260152872
- task:
type: STS
dataset:
name: MTEB STS22 (en)
type: None
config: en
split: test
revision: eea2b4fe26a775864c896887d910b76a8098ad3f
metrics:
- type: cos_sim_pearson
value: 61.43534524580482
- type: cos_sim_spearman
value: 59.839157733781434
- type: euclidean_pearson
value: 61.83093863698779
- type: euclidean_spearman
value: 59.839157733781434
- type: manhattan_pearson
value: 62.55988010471628
- type: manhattan_spearman
value: 60.30306061143011
- task:
type: STS
dataset:
name: MTEB STSBenchmark
type: None
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_pearson
value: 72.25188934839379
- type: cos_sim_spearman
value: 70.9113050369473
- type: euclidean_pearson
value: 72.68710352046212
- type: euclidean_spearman
value: 70.9113534378691
- type: manhattan_pearson
value: 73.09745859415004
- type: manhattan_spearman
value: 71.26505067192102
- task:
type: Reranking
dataset:
name: MTEB SciDocsRR
type: None
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 67.5036392977626
- type: mrr
value: 87.43891003694925
- task:
type: Retrieval
dataset:
name: MTEB SciFact
type: None
config: default
split: test
revision: 0228b52cf27578f30900b9e5271d331663a030d7
metrics:
- type: map_at_1
value: 20.889
- type: map_at_10
value: 27.165
- type: map_at_100
value: 28.368
- type: map_at_1000
value: 28.483999999999998
- type: map_at_3
value: 25.180999999999997
- type: map_at_5
value: 26.269
- type: mrr_at_1
value: 22.0
- type: mrr_at_10
value: 28.512999999999998
- type: mrr_at_100
value: 29.531000000000002
- type: mrr_at_1000
value: 29.635
- type: mrr_at_3
value: 26.611
- type: mrr_at_5
value: 27.594
- type: ndcg_at_1
value: 22.0
- type: ndcg_at_10
value: 30.814000000000004
- type: ndcg_at_100
value: 36.647999999999996
- type: ndcg_at_1000
value: 39.81
- type: ndcg_at_3
value: 26.845999999999997
- type: ndcg_at_5
value: 28.677999999999997
- type: precision_at_1
value: 22.0
- type: precision_at_10
value: 4.5
- type: precision_at_100
value: 0.773
- type: precision_at_1000
value: 0.105
- type: precision_at_3
value: 10.778
- type: precision_at_5
value: 7.5329999999999995
- type: recall_at_1
value: 20.889
- type: recall_at_10
value: 40.861
- type: recall_at_100
value: 68.089
- type: recall_at_1000
value: 93.05
- type: recall_at_3
value: 30.083
- type: recall_at_5
value: 34.556
- task:
type: PairClassification
dataset:
name: MTEB SprintDuplicateQuestions
type: None
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.47524752475248
- type: cos_sim_ap
value: 75.756486791625
- type: cos_sim_f1
value: 70.0162074554295
- type: cos_sim_precision
value: 76.14571092831962
- type: cos_sim_recall
value: 64.8
- type: dot_accuracy
value: 99.47524752475248
- type: dot_ap
value: 75.756486791625
- type: dot_f1
value: 70.0162074554295
- type: dot_precision
value: 76.14571092831962
- type: dot_recall
value: 64.8
- type: euclidean_accuracy
value: 99.47524752475248
- type: euclidean_ap
value: 75.756486791625
- type: euclidean_f1
value: 70.0162074554295
- type: euclidean_precision
value: 76.14571092831962
- type: euclidean_recall
value: 64.8
- type: manhattan_accuracy
value: 99.53069306930693
- type: manhattan_ap
value: 78.93311079752957
- type: manhattan_f1
value: 72.61292166952545
- type: manhattan_precision
value: 84.77970627503338
- type: manhattan_recall
value: 63.5
- type: max_accuracy
value: 99.53069306930693
- type: max_ap
value: 78.93311079752957
- type: max_f1
value: 72.61292166952545
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClustering
type: None
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 38.956591584917824
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClusteringP2P
type: None
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 28.829387041051085
- task:
type: Reranking
dataset:
name: MTEB StackOverflowDupQuestions
type: None
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 41.618168302388256
- type: mrr
value: 42.031210211357276
- task:
type: Summarization
dataset:
name: MTEB SummEval
type: None
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 29.716182681356333
- type: cos_sim_spearman
value: 28.852160879670087
- type: dot_pearson
value: 29.716182648715844
- type: dot_spearman
value: 28.951026187665967
- task:
type: Retrieval
dataset:
name: MTEB Touche2020
type: None
config: default
split: test
revision: a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f
metrics:
- type: map_at_1
value: 2.157
- type: map_at_10
value: 6.787999999999999
- type: map_at_100
value: 9.948
- type: map_at_1000
value: 11.331
- type: map_at_3
value: 4.642
- type: map_at_5
value: 5.718999999999999
- type: mrr_at_1
value: 28.571
- type: mrr_at_10
value: 39.195
- type: mrr_at_100
value: 40.778999999999996
- type: mrr_at_1000
value: 40.797
- type: mrr_at_3
value: 36.394999999999996
- type: mrr_at_5
value: 38.129000000000005
- type: ndcg_at_1
value: 28.571
- type: ndcg_at_10
value: 17.936
- type: ndcg_at_100
value: 26.552999999999997
- type: ndcg_at_1000
value: 38.318000000000005
- type: ndcg_at_3
value: 24.192
- type: ndcg_at_5
value: 21.732000000000003
- type: precision_at_1
value: 28.571
- type: precision_at_10
value: 14.285999999999998
- type: precision_at_100
value: 5.489999999999999
- type: precision_at_1000
value: 1.2710000000000001
- type: precision_at_3
value: 24.490000000000002
- type: precision_at_5
value: 20.816000000000003
- type: recall_at_1
value: 2.157
- type: recall_at_10
value: 9.729000000000001
- type: recall_at_100
value: 32.688
- type: recall_at_1000
value: 69.123
- type: recall_at_3
value: 5.26
- type: recall_at_5
value: 7.109
- task:
type: Classification
dataset:
name: MTEB ToxicConversationsClassification
type: None
config: default
split: test
revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
metrics:
- type: accuracy
value: 67.9134
- type: ap
value: 12.774220384041032
- type: f1
value: 52.153059662642434
- task:
type: Classification
dataset:
name: MTEB TweetSentimentExtractionClassification
type: None
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 53.613469156762875
- type: f1
value: 53.786522868566145
- task:
type: Clustering
dataset:
name: MTEB TwentyNewsgroupsClustering
type: None
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 30.747359446594245
- task:
type: PairClassification
dataset:
name: MTEB TwitterSemEval2015
type: None
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 83.97806520832091
- type: cos_sim_ap
value: 66.35427447671117
- type: cos_sim_f1
value: 63.0426851514046
- type: cos_sim_precision
value: 58.47056169636815
- type: cos_sim_recall
value: 68.3905013192612
- type: dot_accuracy
value: 83.97806520832091
- type: dot_ap
value: 66.35427447671117
- type: dot_f1
value: 63.0426851514046
- type: dot_precision
value: 58.47056169636815
- type: dot_recall
value: 68.3905013192612
- type: euclidean_accuracy
value: 83.97806520832091
- type: euclidean_ap
value: 66.35427447671117
- type: euclidean_f1
value: 63.0426851514046
- type: euclidean_precision
value: 58.47056169636815
- type: euclidean_recall
value: 68.3905013192612
- type: manhattan_accuracy
value: 83.97210466710378
- type: manhattan_ap
value: 65.97618382203181
- type: manhattan_f1
value: 62.53991648243675
- type: manhattan_precision
value: 58.501838235294116
- type: manhattan_recall
value: 67.17678100263852
- type: max_accuracy
value: 83.97806520832091
- type: max_ap
value: 66.35427447671117
- type: max_f1
value: 63.0426851514046
- task:
type: PairClassification
dataset:
name: MTEB TwitterURLCorpus
type: None
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 86.71362595567975
- type: cos_sim_ap
value: 80.86796720185393
- type: cos_sim_f1
value: 73.24097703244622
- type: cos_sim_precision
value: 69.5540783824955
- type: cos_sim_recall
value: 77.34062211271944
- type: dot_accuracy
value: 86.71362595567975
- type: dot_ap
value: 80.86797238493406
- type: dot_f1
value: 73.24097703244622
- type: dot_precision
value: 69.5540783824955
- type: dot_recall
value: 77.34062211271944
- type: euclidean_accuracy
value: 86.71362595567975
- type: euclidean_ap
value: 80.86796690301992
- type: euclidean_f1
value: 73.24097703244622
- type: euclidean_precision
value: 69.5540783824955
- type: euclidean_recall
value: 77.34062211271944
- type: manhattan_accuracy
value: 86.64376916210657
- type: manhattan_ap
value: 80.8520473693602
- type: manhattan_f1
value: 73.15887850467291
- type: manhattan_precision
value: 71.10158407208255
- type: manhattan_recall
value: 75.33877425315676
- type: max_accuracy
value: 86.71362595567975
- type: max_ap
value: 80.86797238493406
- type: max_f1
value: 73.24097703244622
---
EmotionClassification", "type": "None", "config": "default", "split": "test", "revision": "4f58c6b202a23cf9a4da393831edf4f9183cad37"}, "metrics": [{"type": "accuracy", "value": 41.114999999999995}, {"type": "f1", "value": 37.00141090816854}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB FEVER", "type": "None", "config": "default", "split": "test", "revision": "bea83ef9e8fb933d90a2f1d5515737465d613e12"}, "metrics": [{"type": "map_at_1", "value": 5.523}, {"type": "map_at_10", "value": 8.036}, {"type": "map_at_100", "value": 8.581999999999999}, {"type": "map_at_1000", "value": 8.657}, {"type": "map_at_3", "value": 7.13}, {"type": "map_at_5", "value": 7.536}, {"type": "mrr_at_1", "value": 5.836}, {"type": "mrr_at_10", "value": 8.547}, {"type": "mrr_at_100", "value": 9.123000000000001}, {"type": "mrr_at_1000", "value": 9.197}, {"type": "mrr_at_3", "value": 7.563000000000001}, {"type": "mrr_at_5", "value": 8.006}, {"type": "ndcg_at_1", "value": 5.836}, {"type": "ndcg_at_10", "value": 9.764000000000001}, {"type": "ndcg_at_100", "value": 12.866}, {"type": "ndcg_at_1000", "value": 15.243}, {"type": "ndcg_at_3", "value": 7.7700000000000005}, {"type": "ndcg_at_5", "value": 8.518}, {"type": "precision_at_1", "value": 5.836}, {"type": "precision_at_10", "value": 1.6070000000000002}, {"type": "precision_at_100", "value": 0.331}, {"type": "precision_at_1000", "value": 0.055}, {"type": "precision_at_3", "value": 3.2849999999999997}, {"type": "precision_at_5", "value": 2.37}, {"type": "recall_at_1", "value": 5.523}, {"type": "recall_at_10", "value": 14.795}, {"type": "recall_at_100", "value": 29.932}, {"type": "recall_at_1000", "value": 48.946}, {"type": "recall_at_3", "value": 9.208}, {"type": "recall_at_5", "value": 10.984}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB FiQA2018", "type": "None", "config": "default", "split": "test", "revision": "27a168819829fe9bcd655c2df245fb19452e8e06"}, "metrics": [{"type": "map_at_1", "value": 4.135}, {"type": "map_at_10", "value": 6.433999999999999}, {"type": "map_at_100", "value": 7.196}, {"type": "map_at_1000", "value": 7.356999999999999}, {"type": "map_at_3", "value": 5.339}, {"type": "map_at_5", "value": 5.878}, {"type": "mrr_at_1", "value": 8.796}, {"type": "mrr_at_10", "value": 12.357999999999999}, {"type": "mrr_at_100", "value": 13.208}, {"type": "mrr_at_1000", "value": 13.318}, {"type": "mrr_at_3", "value": 10.777000000000001}, {"type": "mrr_at_5", "value": 11.525}, {"type": "ndcg_at_1", "value": 8.796}, {"type": "ndcg_at_10", "value": 9.332}, {"type": "ndcg_at_100", "value": 13.517999999999999}, {"type": "ndcg_at_1000", "value": 17.907999999999998}, {"type": "ndcg_at_3", "value": 7.481999999999999}, {"type": "ndcg_at_5", "value": 8.065}, {"type": "precision_at_1", "value": 8.796}, {"type": "precision_at_10", "value": 2.8240000000000003}, {"type": "precision_at_100", "value": 0.705}, {"type": "precision_at_1000", "value": 0.14400000000000002}, {"type": "precision_at_3", "value": 4.887}, {"type": "precision_at_5", "value": 3.8580000000000005}, {"type": "recall_at_1", "value": 4.135}, {"type": "recall_at_10", "value": 12.292}, {"type": "recall_at_100", "value": 28.915999999999997}, {"type": "recall_at_1000", "value": 57.477999999999994}, {"type": "recall_at_3", "value": 6.747}, {"type": "recall_at_5", "value": 8.667}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB HotpotQA", "type": "None", "config": "default", "split": "test", "revision": "ab518f4d6fcca38d87c25209f94beba119d02014"}, "metrics": [{"type": "map_at_1", 
"value": 5.928}, {"type": "map_at_10", "value": 8.469}, {"type": "map_at_100", "value": 8.936}, {"type": "map_at_1000", "value": 9.02}, {"type": "map_at_3", "value": 7.582}, {"type": "map_at_5", "value": 8.021}, {"type": "mrr_at_1", "value": 11.857}, {"type": "mrr_at_10", "value": 15.675}, {"type": "mrr_at_100", "value": 16.273}, {"type": "mrr_at_1000", "value": 16.356}, {"type": "mrr_at_3", "value": 14.347999999999999}, {"type": "mrr_at_5", "value": 14.995}, {"type": "ndcg_at_1", "value": 11.857}, {"type": "ndcg_at_10", "value": 11.651}, {"type": "ndcg_at_100", "value": 14.374999999999998}, {"type": "ndcg_at_1000", "value": 16.912}, {"type": "ndcg_at_3", "value": 9.625}, {"type": "ndcg_at_5", "value": 10.474}, {"type": "precision_at_1", "value": 11.857}, {"type": "precision_at_10", "value": 2.777}, {"type": "precision_at_100", "value": 0.503}, {"type": "precision_at_1000", "value": 0.08499999999999999}, {"type": "precision_at_3", "value": 6.140000000000001}, {"type": "precision_at_5", "value": 4.362}, {"type": "recall_at_1", "value": 5.928}, {"type": "recall_at_10", "value": 13.883000000000001}, {"type": "recall_at_100", "value": 25.137999999999998}, {"type": "recall_at_1000", "value": 42.315999999999995}, {"type": "recall_at_3", "value": 9.21}, {"type": "recall_at_5", "value": 10.905}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB ImdbClassification", "type": "None", "config": "default", "split": "test", "revision": "3d86128a09e091d6018b6d26cad27f2739fc2db7"}, "metrics": [{"type": "accuracy", "value": 65.4388}, {"type": "ap", "value": 60.440774024423426}, {"type": "f1", "value": 65.31315753102281}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB MSMARCO", "type": "None", "config": "default", "split": "dev", "revision": "c5a29a104738b98a9e76336939199e264163d4a0"}, "metrics": [{"type": "map_at_1", "value": 3.4479999999999995}, {"type": "map_at_10", "value": 5.74}, {"type": "map_at_100", "value": 6.2780000000000005}, {"type": "map_at_1000", "value": 6.358999999999999}, {"type": "map_at_3", "value": 4.82}, {"type": "map_at_5", "value": 5.3}, {"type": "mrr_at_1", "value": 3.5389999999999997}, {"type": "mrr_at_10", "value": 5.906000000000001}, {"type": "mrr_at_100", "value": 6.455}, {"type": "mrr_at_1000", "value": 6.5360000000000005}, {"type": "mrr_at_3", "value": 4.9639999999999995}, {"type": "mrr_at_5", "value": 5.453}, {"type": "ndcg_at_1", "value": 3.5389999999999997}, {"type": "ndcg_at_10", "value": 7.255000000000001}, {"type": "ndcg_at_100", "value": 10.308}, {"type": "ndcg_at_1000", "value": 12.93}, {"type": "ndcg_at_3", "value": 5.314}, {"type": "ndcg_at_5", "value": 6.184}, {"type": "precision_at_1", "value": 3.5389999999999997}, {"type": "precision_at_10", "value": 1.246}, {"type": "precision_at_100", "value": 0.28500000000000003}, {"type": "precision_at_1000", "value": 0.051000000000000004}, {"type": "precision_at_3", "value": 2.297}, {"type": "precision_at_5", "value": 1.814}, {"type": "recall_at_1", "value": 3.4479999999999995}, {"type": "recall_at_10", "value": 11.982}, {"type": "recall_at_100", "value": 27.123}, {"type": "recall_at_1000", "value": 48.489}, {"type": "recall_at_3", "value": 6.607}, {"type": "recall_at_5", "value": 8.706}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MTOPDomainClassification (en)", "type": "None", "config": "en", "split": "test", "revision": "d80d48c1eb48d3562165c59d59d0034df9fff0bf"}, "metrics": [{"type": "accuracy", "value": 85.9484724122207}, {"type": "f1", "value": 85.39768490584245}]}, 
{"task": {"type": "Classification"}, "dataset": {"name": "MTEB MTOPIntentClassification (en)", "type": "None", "config": "en", "split": "test", "revision": "ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba"}, "metrics": [{"type": "accuracy", "value": 58.48837209302326}, {"type": "f1", "value": 39.10849416181491}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MassiveIntentClassification (en)", "type": "None", "config": "en", "split": "test", "revision": "31efe3c427b0bae9c22cbb560b8f15491cc6bed7"}, "metrics": [{"type": "accuracy", "value": 60.632145258910555}, {"type": "f1", "value": 58.09773014884143}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MassiveScenarioClassification (en)", "type": "None", "config": "en", "split": "test", "revision": "7d571f92784cd94a019292a1f45445077d0ef634"}, "metrics": [{"type": "accuracy", "value": 67.68325487558843}, {"type": "f1", "value": 65.91204845805859}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB MedrxivClusteringP2P", "type": "None", "config": "default", "split": "test", "revision": "e7a26af6f3ae46b30dde8737f02c07b1505bcc73"}, "metrics": [{"type": "v_measure", "value": 26.41069242141184}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB MedrxivClusteringS2S", "type": "None", "config": "default", "split": "test", "revision": "35191c8c0dca72d8ff3efcd72aa802307d469663"}, "metrics": [{"type": "v_measure", "value": 23.307848920918044}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB MindSmallReranking", "type": "None", "config": "default", "split": "test", "revision": "3bdac13927fdc888b903db93b2ffdbd90b295a69"}, "metrics": [{"type": "map", "value": 28.270878365120332}, {"type": "mrr", "value": 29.057926505909254}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB NFCorpus", "type": "None", "config": "default", "split": "test", "revision": "ec0fa4fe99da2ff19ca1214b7966684033a58814"}, "metrics": [{"type": "map_at_1", "value": 1.855}, {"type": "map_at_10", "value": 3.582}, {"type": "map_at_100", "value": 4.694}, {"type": "map_at_1000", "value": 5.739}, {"type": "map_at_3", "value": 2.677}, {"type": "map_at_5", "value": 3.1}, {"type": "mrr_at_1", "value": 18.884999999999998}, {"type": "mrr_at_10", "value": 27.256999999999998}, {"type": "mrr_at_100", "value": 28.327999999999996}, {"type": "mrr_at_1000", "value": 28.402}, {"type": "mrr_at_3", "value": 24.2}, {"type": "mrr_at_5", "value": 26.011}, {"type": "ndcg_at_1", "value": 17.957}, {"type": "ndcg_at_10", "value": 14.051}, {"type": "ndcg_at_100", "value": 14.282}, {"type": "ndcg_at_1000", "value": 24.3}, {"type": "ndcg_at_3", "value": 15.478}, {"type": "ndcg_at_5", "value": 14.782}, {"type": "precision_at_1", "value": 18.884999999999998}, {"type": "precision_at_10", "value": 10.743}, {"type": "precision_at_100", "value": 4.449}, {"type": "precision_at_1000", "value": 1.7670000000000001}, {"type": "precision_at_3", "value": 14.654}, {"type": "precision_at_5", "value": 12.940999999999999}, {"type": "recall_at_1", "value": 1.855}, {"type": "recall_at_10", "value": 6.861000000000001}, {"type": "recall_at_100", "value": 18.044}, {"type": "recall_at_1000", "value": 52.712}, {"type": "recall_at_3", "value": 3.3369999999999997}, {"type": "recall_at_5", "value": 4.562}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB NQ", "type": "None", "config": "default", "split": "test", "revision": "b774495ed302d8c44a3a7ea25c90dbce03968f31"}, "metrics": [{"type": "map_at_1", "value": 4.881}, {"type": "map_at_10", "value": 
8.241999999999999}, {"type": "map_at_100", "value": 8.956999999999999}, {"type": "map_at_1000", "value": 9.062000000000001}, {"type": "map_at_3", "value": 6.981}, {"type": "map_at_5", "value": 7.61}, {"type": "mrr_at_1", "value": 5.5329999999999995}, {"type": "mrr_at_10", "value": 9.184000000000001}, {"type": "mrr_at_100", "value": 9.918000000000001}, {"type": "mrr_at_1000", "value": 10.018}, {"type": "mrr_at_3", "value": 7.836}, {"type": "mrr_at_5", "value": 8.518}, {"type": "ndcg_at_1", "value": 5.5329999999999995}, {"type": "ndcg_at_10", "value": 10.554}, {"type": "ndcg_at_100", "value": 14.341999999999999}, {"type": "ndcg_at_1000", "value": 17.458000000000002}, {"type": "ndcg_at_3", "value": 7.8759999999999994}, {"type": "ndcg_at_5", "value": 9.023}, {"type": "precision_at_1", "value": 5.5329999999999995}, {"type": "precision_at_10", "value": 1.944}, {"type": "precision_at_100", "value": 0.411}, {"type": "precision_at_1000", "value": 0.07100000000000001}, {"type": "precision_at_3", "value": 3.669}, {"type": "precision_at_5", "value": 2.8160000000000003}, {"type": "recall_at_1", "value": 4.881}, {"type": "recall_at_10", "value": 16.898}, {"type": "recall_at_100", "value": 34.625}, {"type": "recall_at_1000", "value": 58.901}, {"type": "recall_at_3", "value": 9.651}, {"type": "recall_at_5", "value": 12.35}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB QuoraRetrieval", "type": "None", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "map_at_1", "value": 53.159}, {"type": "map_at_10", "value": 64.053}, {"type": "map_at_100", "value": 64.938}, {"type": "map_at_1000", "value": 64.994}, {"type": "map_at_3", "value": 61.413}, {"type": "map_at_5", "value": 62.966}, {"type": "mrr_at_1", "value": 61.129999999999995}, {"type": "mrr_at_10", "value": 68.84400000000001}, {"type": "mrr_at_100", "value": 69.3}, {"type": "mrr_at_1000", "value": 69.319}, {"type": "mrr_at_3", "value": 67.113}, {"type": "mrr_at_5", "value": 68.162}, {"type": "ndcg_at_1", "value": 61.160000000000004}, {"type": "ndcg_at_10", "value": 68.944}, {"type": "ndcg_at_100", "value": 72.10499999999999}, {"type": "ndcg_at_1000", "value": 73.046}, {"type": "ndcg_at_3", "value": 65.223}, {"type": "ndcg_at_5", "value": 67.05}, {"type": "precision_at_1", "value": 61.160000000000004}, {"type": "precision_at_10", "value": 10.392999999999999}, {"type": "precision_at_100", "value": 1.327}, {"type": "precision_at_1000", "value": 0.149}, {"type": "precision_at_3", "value": 28.13}, {"type": "precision_at_5", "value": 18.656}, {"type": "recall_at_1", "value": 53.159}, {"type": "recall_at_10", "value": 78.412}, {"type": "recall_at_100", "value": 91.399}, {"type": "recall_at_1000", "value": 97.52}, {"type": "recall_at_3", "value": 67.794}, {"type": "recall_at_5", "value": 72.801}, {"type": "map_at_1", "value": 1.8450000000000002}, {"type": "map_at_10", "value": 4.172}, {"type": "map_at_100", "value": 5.092}, {"type": "map_at_1000", "value": 5.3100000000000005}, {"type": "map_at_3", "value": 3.093}, {"type": "map_at_5", "value": 3.6450000000000005}, {"type": "mrr_at_1", "value": 9.1}, {"type": "mrr_at_10", "value": 15.15}, {"type": "mrr_at_100", "value": 16.216}, {"type": "mrr_at_1000", "value": 16.332}, {"type": "mrr_at_3", "value": 12.55}, {"type": "mrr_at_5", "value": 13.975000000000001}, {"type": "ndcg_at_1", "value": 9.1}, {"type": "ndcg_at_10", "value": 8.065999999999999}, {"type": "ndcg_at_100", "value": 12.982}, {"type": "ndcg_at_1000", "value": 18.046}, {"type": "ndcg_at_3", "value": 
7.295999999999999}, {"type": "ndcg_at_5", "value": 6.572}, {"type": "precision_at_1", "value": 9.1}, {"type": "precision_at_10", "value": 4.29}, {"type": "precision_at_100", "value": 1.16}, {"type": "precision_at_1000", "value": 0.23900000000000002}, {"type": "precision_at_3", "value": 6.833}, {"type": "precision_at_5", "value": 5.88}, {"type": "recall_at_1", "value": 1.8450000000000002}, {"type": "recall_at_10", "value": 8.706999999999999}, {"type": "recall_at_100", "value": 23.645}, {"type": "recall_at_1000", "value": 48.597}, {"type": "recall_at_3", "value": 4.175}, {"type": "recall_at_5", "value": 5.973}, {"type": "map_at_1", "value": 0.058}, {"type": "map_at_10", "value": 0.445}, {"type": "map_at_100", "value": 2.489}, {"type": "map_at_1000", "value": 6.3100000000000005}, {"type": "map_at_3", "value": 0.16999999999999998}, {"type": "map_at_5", "value": 0.254}, {"type": "mrr_at_1", "value": 32.0}, {"type": "mrr_at_10", "value": 46.016}, {"type": "mrr_at_100", "value": 46.683}, {"type": "mrr_at_1000", "value": 46.719}, {"type": "mrr_at_3", "value": 41.667}, {"type": "mrr_at_5", "value": 42.967}, {"type": "ndcg_at_1", "value": 26.0}, {"type": "ndcg_at_10", "value": 29.885}, {"type": "ndcg_at_100", "value": 22.958000000000002}, {"type": "ndcg_at_1000", "value": 22.244}, {"type": "ndcg_at_3", "value": 29.787999999999997}, {"type": "ndcg_at_5", "value": 29.494999999999997}, {"type": "precision_at_1", "value": 32.0}, {"type": "precision_at_10", "value": 33.800000000000004}, {"type": "precision_at_100", "value": 24.52}, {"type": "precision_at_1000", "value": 11.196}, {"type": "precision_at_3", "value": 35.333}, {"type": "precision_at_5", "value": 34.0}, {"type": "recall_at_1", "value": 0.058}, {"type": "recall_at_10", "value": 0.657}, {"type": "recall_at_100", "value": 5.069}, {"type": "recall_at_1000", "value": 22.447}, {"type": "recall_at_3", "value": 0.2}, {"type": "recall_at_5", "value": 0.32299999999999995}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB RedditClustering", "type": "None", "config": "default", "split": "test", "revision": "24640382cdbf8abc73003fb0fa6d111a705499eb"}, "metrics": [{"type": "v_measure", "value": 30.140589231842256}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB RedditClusteringP2P", "type": "None", "config": "default", "split": "test", "revision": "282350215ef01743dc01b456c7f5241fa8937f16"}, "metrics": [{"type": "v_measure", "value": 39.92770613505385}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB SICK-R", "type": "None", "config": "default", "split": "test", "revision": "a6ea5a8cab320b040a23452cc28066d9beae2cee"}, "metrics": [{"type": "cos_sim_pearson", "value": 75.59024815989618}, {"type": "cos_sim_spearman", "value": 68.11624653233133}, {"type": "euclidean_pearson", "value": 73.27920094980502}, {"type": "euclidean_spearman", "value": 68.11632959681863}, {"type": "manhattan_pearson", "value": 72.54935141266294}, {"type": "manhattan_spearman", "value": 67.12457070604133}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS12", "type": "None", "config": "default", "split": "test", "revision": "a0d554a64d88156834ff5ae9920b964011b16384"}, "metrics": [{"type": "cos_sim_pearson", "value": 69.40126270570799}, {"type": "cos_sim_spearman", "value": 62.14207404840335}, {"type": "euclidean_pearson", "value": 66.27602017682412}, {"type": "euclidean_spearman", "value": 62.143384728461314}, {"type": "manhattan_pearson", "value": 67.07706053549664}, {"type": "manhattan_spearman", "value": 63.06497657163255}]}, {"task": {"type": 
"STS"}, "dataset": {"name": "MTEB STS13", "type": "None", "config": "default", "split": "test", "revision": "7e90230a92c190f1bf69ae9002b8cea547a64cca"}, "metrics": [{"type": "cos_sim_pearson", "value": 75.5989515866992}, {"type": "cos_sim_spearman", "value": 77.15211512453997}, {"type": "euclidean_pearson", "value": 76.70296919445704}, {"type": "euclidean_spearman", "value": 77.15215294384531}, {"type": "manhattan_pearson", "value": 77.00183340244841}, {"type": "manhattan_spearman", "value": 77.54347126493187}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS14", "type": "None", "config": "default", "split": "test", "revision": "6031580fec1f6af667f0bd2da0a551cf4f0b2375"}, "metrics": [{"type": "cos_sim_pearson", "value": 73.76592708615566}, {"type": "cos_sim_spearman", "value": 70.57102535486983}, {"type": "euclidean_pearson", "value": 73.16493844323281}, {"type": "euclidean_spearman", "value": 70.57101566858893}, {"type": "manhattan_pearson", "value": 73.3644832097739}, {"type": "manhattan_spearman", "value": 70.93527541966915}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS15", "type": "None", "config": "default", "split": "test", "revision": "ae752c7c21bf194d8b67fd573edf7ae58183cbe3"}, "metrics": [{"type": "cos_sim_pearson", "value": 75.95076880553377}, {"type": "cos_sim_spearman", "value": 77.68458699868269}, {"type": "euclidean_pearson", "value": 77.7470713475935}, {"type": "euclidean_spearman", "value": 77.6845933113232}, {"type": "manhattan_pearson", "value": 78.19369618957612}, {"type": "manhattan_spearman", "value": 78.11088657087784}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS16", "type": "None", "config": "default", "split": "test", "revision": "4d8694f8f0e0100860b497b999b3dbed754a0513"}, "metrics": [{"type": "cos_sim_pearson", "value": 71.9715763028299}, {"type": "cos_sim_spearman", "value": 73.53220647955904}, {"type": "euclidean_pearson", "value": 73.57406594330985}, {"type": "euclidean_spearman", "value": 73.53303581777323}, {"type": "manhattan_pearson", "value": 74.03967460920595}, {"type": "manhattan_spearman", "value": 74.05778553630698}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS17 (en-en)", "type": "None", "config": "en-en", "split": "test", "revision": "af5e6fb845001ecf41f4c1e033ce921939a2a68d"}, "metrics": [{"type": "cos_sim_pearson", "value": 78.73667148725723}, {"type": "cos_sim_spearman", "value": 80.81028828869353}, {"type": "euclidean_pearson", "value": 81.15810431179573}, {"type": "euclidean_spearman", "value": 80.81116429309112}, {"type": "manhattan_pearson", "value": 81.55719120035107}, {"type": "manhattan_spearman", "value": 81.20882260152872}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS22 (en)", "type": "None", "config": "en", "split": "test", "revision": "eea2b4fe26a775864c896887d910b76a8098ad3f"}, "metrics": [{"type": "cos_sim_pearson", "value": 61.43534524580482}, {"type": "cos_sim_spearman", "value": 59.839157733781434}, {"type": "euclidean_pearson", "value": 61.83093863698779}, {"type": "euclidean_spearman", "value": 59.839157733781434}, {"type": "manhattan_pearson", "value": 62.55988010471628}, {"type": "manhattan_spearman", "value": 60.30306061143011}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STSBenchmark", "type": "None", "config": "default", "split": "test", "revision": "b0fddb56ed78048fa8b90373c8a3cfc37b684831"}, "metrics": [{"type": "cos_sim_pearson", "value": 72.25188934839379}, {"type": "cos_sim_spearman", "value": 70.9113050369473}, {"type": "euclidean_pearson", 
"value": 72.68710352046212}, {"type": "euclidean_spearman", "value": 70.9113534378691}, {"type": "manhattan_pearson", "value": 73.09745859415004}, {"type": "manhattan_spearman", "value": 71.26505067192102}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB SciDocsRR", "type": "None", "config": "default", "split": "test", "revision": "d3c5e1fc0b855ab6097bf1cda04dd73947d7caab"}, "metrics": [{"type": "map", "value": 67.5036392977626}, {"type": "mrr", "value": 87.43891003694925}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB SciFact", "type": "None", "config": "default", "split": "test", "revision": "0228b52cf27578f30900b9e5271d331663a030d7"}, "metrics": [{"type": "map_at_1", "value": 20.889}, {"type": "map_at_10", "value": 27.165}, {"type": "map_at_100", "value": 28.368}, {"type": "map_at_1000", "value": 28.483999999999998}, {"type": "map_at_3", "value": 25.180999999999997}, {"type": "map_at_5", "value": 26.269}, {"type": "mrr_at_1", "value": 22.0}, {"type": "mrr_at_10", "value": 28.512999999999998}, {"type": "mrr_at_100", "value": 29.531000000000002}, {"type": "mrr_at_1000", "value": 29.635}, {"type": "mrr_at_3", "value": 26.611}, {"type": "mrr_at_5", "value": 27.594}, {"type": "ndcg_at_1", "value": 22.0}, {"type": "ndcg_at_10", "value": 30.814000000000004}, {"type": "ndcg_at_100", "value": 36.647999999999996}, {"type": "ndcg_at_1000", "value": 39.81}, {"type": "ndcg_at_3", "value": 26.845999999999997}, {"type": "ndcg_at_5", "value": 28.677999999999997}, {"type": "precision_at_1", "value": 22.0}, {"type": "precision_at_10", "value": 4.5}, {"type": "precision_at_100", "value": 0.773}, {"type": "precision_at_1000", "value": 0.105}, {"type": "precision_at_3", "value": 10.778}, {"type": "precision_at_5", "value": 7.5329999999999995}, {"type": "recall_at_1", "value": 20.889}, {"type": "recall_at_10", "value": 40.861}, {"type": "recall_at_100", "value": 68.089}, {"type": "recall_at_1000", "value": 93.05}, {"type": "recall_at_3", "value": 30.083}, {"type": "recall_at_5", "value": 34.556}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB SprintDuplicateQuestions", "type": "None", "config": "default", "split": "test", "revision": "d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46"}, "metrics": [{"type": "cos_sim_accuracy", "value": 99.47524752475248}, {"type": "cos_sim_ap", "value": 75.756486791625}, {"type": "cos_sim_f1", "value": 70.0162074554295}, {"type": "cos_sim_precision", "value": 76.14571092831962}, {"type": "cos_sim_recall", "value": 64.8}, {"type": "dot_accuracy", "value": 99.47524752475248}, {"type": "dot_ap", "value": 75.756486791625}, {"type": "dot_f1", "value": 70.0162074554295}, {"type": "dot_precision", "value": 76.14571092831962}, {"type": "dot_recall", "value": 64.8}, {"type": "euclidean_accuracy", "value": 99.47524752475248}, {"type": "euclidean_ap", "value": 75.756486791625}, {"type": "euclidean_f1", "value": 70.0162074554295}, {"type": "euclidean_precision", "value": 76.14571092831962}, {"type": "euclidean_recall", "value": 64.8}, {"type": "manhattan_accuracy", "value": 99.53069306930693}, {"type": "manhattan_ap", "value": 78.93311079752957}, {"type": "manhattan_f1", "value": 72.61292166952545}, {"type": "manhattan_precision", "value": 84.77970627503338}, {"type": "manhattan_recall", "value": 63.5}, {"type": "max_accuracy", "value": 99.53069306930693}, {"type": "max_ap", "value": 78.93311079752957}, {"type": "max_f1", "value": 72.61292166952545}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB StackExchangeClustering", "type": 
"None", "config": "default", "split": "test", "revision": "6cbc1f7b2bc0622f2e39d2c77fa502909748c259"}, "metrics": [{"type": "v_measure", "value": 38.956591584917824}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB StackExchangeClusteringP2P", "type": "None", "config": "default", "split": "test", "revision": "815ca46b2622cec33ccafc3735d572c266efdb44"}, "metrics": [{"type": "v_measure", "value": 28.829387041051085}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB StackOverflowDupQuestions", "type": "None", "config": "default", "split": "test", "revision": "e185fbe320c72810689fc5848eb6114e1ef5ec69"}, "metrics": [{"type": "map", "value": 41.618168302388256}, {"type": "mrr", "value": 42.031210211357276}]}, {"task": {"type": "Summarization"}, "dataset": {"name": "MTEB SummEval", "type": "None", "config": "default", "split": "test", "revision": "cda12ad7615edc362dbf25a00fdd61d3b1eaf93c"}, "metrics": [{"type": "cos_sim_pearson", "value": 29.716182681356333}, {"type": "cos_sim_spearman", "value": 28.852160879670087}, {"type": "dot_pearson", "value": 29.716182648715844}, {"type": "dot_spearman", "value": 28.951026187665967}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB Touche2020", "type": "None", "config": "default", "split": "test", "revision": "a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f"}, "metrics": [{"type": "map_at_1", "value": 2.157}, {"type": "map_at_10", "value": 6.787999999999999}, {"type": "map_at_100", "value": 9.948}, {"type": "map_at_1000", "value": 11.331}, {"type": "map_at_3", "value": 4.642}, {"type": "map_at_5", "value": 5.718999999999999}, {"type": "mrr_at_1", "value": 28.571}, {"type": "mrr_at_10", "value": 39.195}, {"type": "mrr_at_100", "value": 40.778999999999996}, {"type": "mrr_at_1000", "value": 40.797}, {"type": "mrr_at_3", "value": 36.394999999999996}, {"type": "mrr_at_5", "value": 38.129000000000005}, {"type": "ndcg_at_1", "value": 28.571}, {"type": "ndcg_at_10", "value": 17.936}, {"type": "ndcg_at_100", "value": 26.552999999999997}, {"type": "ndcg_at_1000", "value": 38.318000000000005}, {"type": "ndcg_at_3", "value": 24.192}, {"type": "ndcg_at_5", "value": 21.732000000000003}, {"type": "precision_at_1", "value": 28.571}, {"type": "precision_at_10", "value": 14.285999999999998}, {"type": "precision_at_100", "value": 5.489999999999999}, {"type": "precision_at_1000", "value": 1.2710000000000001}, {"type": "precision_at_3", "value": 24.490000000000002}, {"type": "precision_at_5", "value": 20.816000000000003}, {"type": "recall_at_1", "value": 2.157}, {"type": "recall_at_10", "value": 9.729000000000001}, {"type": "recall_at_100", "value": 32.688}, {"type": "recall_at_1000", "value": 69.123}, {"type": "recall_at_3", "value": 5.26}, {"type": "recall_at_5", "value": 7.109}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB ToxicConversationsClassification", "type": "None", "config": "default", "split": "test", "revision": "d7c0de2777da35d6aae2200a62c6e0e5af397c4c"}, "metrics": [{"type": "accuracy", "value": 67.9134}, {"type": "ap", "value": 12.774220384041032}, {"type": "f1", "value": 52.153059662642434}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB TweetSentimentExtractionClassification", "type": "None", "config": "default", "split": "test", "revision": "d604517c81ca91fe16a244d1248fc021f9ecee7a"}, "metrics": [{"type": "accuracy", "value": 53.613469156762875}, {"type": "f1", "value": 53.786522868566145}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB TwentyNewsgroupsClustering", "type": 
"None", "config": "default", "split": "test", "revision": "6125ec4e24fa026cec8a478383ee943acfbd5449"}, "metrics": [{"type": "v_measure", "value": 30.747359446594245}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB TwitterSemEval2015", "type": "None", "config": "default", "split": "test", "revision": "70970daeab8776df92f5ea462b6173c0b46fd2d1"}, "metrics": [{"type": "cos_sim_accuracy", "value": 83.97806520832091}, {"type": "cos_sim_ap", "value": 66.35427447671117}, {"type": "cos_sim_f1", "value": 63.0426851514046}, {"type": "cos_sim_precision", "value": 58.47056169636815}, {"type": "cos_sim_recall", "value": 68.3905013192612}, {"type": "dot_accuracy", "value": 83.97806520832091}, {"type": "dot_ap", "value": 66.35427447671117}, {"type": "dot_f1", "value": 63.0426851514046}, {"type": "dot_precision", "value": 58.47056169636815}, {"type": "dot_recall", "value": 68.3905013192612}, {"type": "euclidean_accuracy", "value": 83.97806520832091}, {"type": "euclidean_ap", "value": 66.35427447671117}, {"type": "euclidean_f1", "value": 63.0426851514046}, {"type": "euclidean_precision", "value": 58.47056169636815}, {"type": "euclidean_recall", "value": 68.3905013192612}, {"type": "manhattan_accuracy", "value": 83.97210466710378}, {"type": "manhattan_ap", "value": 65.97618382203181}, {"type": "manhattan_f1", "value": 62.53991648243675}, {"type": "manhattan_precision", "value": 58.501838235294116}, {"type": "manhattan_recall", "value": 67.17678100263852}, {"type": "max_accuracy", "value": 83.97806520832091}, {"type": "max_ap", "value": 66.35427447671117}, {"type": "max_f1", "value": 63.0426851514046}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB TwitterURLCorpus", "type": "None", "config": "default", "split": "test", "revision": "8b6510b0b1fa4e4c4f879467980e9be563ec1cdf"}, "metrics": [{"type": "cos_sim_accuracy", "value": 86.71362595567975}, {"type": "cos_sim_ap", "value": 80.86796720185393}, {"type": "cos_sim_f1", "value": 73.24097703244622}, {"type": "cos_sim_precision", "value": 69.5540783824955}, {"type": "cos_sim_recall", "value": 77.34062211271944}, {"type": "dot_accuracy", "value": 86.71362595567975}, {"type": "dot_ap", "value": 80.86797238493406}, {"type": "dot_f1", "value": 73.24097703244622}, {"type": "dot_precision", "value": 69.5540783824955}, {"type": "dot_recall", "value": 77.34062211271944}, {"type": "euclidean_accuracy", "value": 86.71362595567975}, {"type": "euclidean_ap", "value": 80.86796690301992}, {"type": "euclidean_f1", "value": 73.24097703244622}, {"type": "euclidean_precision", "value": 69.5540783824955}, {"type": "euclidean_recall", "value": 77.34062211271944}, {"type": "manhattan_accuracy", "value": 86.64376916210657}, {"type": "manhattan_ap", "value": 80.8520473693602}, {"type": "manhattan_f1", "value": 73.15887850467291}, {"type": "manhattan_precision", "value": 71.10158407208255}, {"type": "manhattan_recall", "value": 75.33877425315676}, {"type": "max_accuracy", "value": 86.71362595567975}, {"type": "max_ap", "value": 80.86797238493406}, {"type": "max_f1", "value": 73.24097703244622}]}]}]} |
|
minhtuan7akp/gte-base-vietnamese-finetune-matryoshka | minhtuan7akp | sentence-similarity | [
"sentence-transformers",
"safetensors",
"new",
"sentence-similarity",
"feature-extraction",
"generated_from_trainer",
"dataset_size:21892",
"loss:MatryoshkaLoss",
"loss:MultipleNegativesRankingLoss",
"custom_code",
"arxiv:1908.10084",
"arxiv:2205.13147",
"arxiv:1705.00652",
"base_model:Alibaba-NLP/gte-multilingual-base",
"base_model:finetune:Alibaba-NLP/gte-multilingual-base",
"model-index",
"autotrain_compatible",
"text-embeddings-inference",
"endpoints_compatible",
"region:us"
]
| 2025-03-04T02:20:08 | 2025-03-04T02:22:44 | 10 | 0 | ---
base_model: Alibaba-NLP/gte-multilingual-base
library_name: sentence-transformers
metrics:
- cosine_accuracy@1
- cosine_accuracy@3
- cosine_accuracy@5
- cosine_accuracy@10
- cosine_precision@1
- cosine_precision@3
- cosine_precision@5
- cosine_precision@10
- cosine_recall@1
- cosine_recall@3
- cosine_recall@5
- cosine_recall@10
- cosine_ndcg@10
- cosine_mrr@10
- cosine_map@100
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:21892
- loss:MatryoshkaLoss
- loss:MultipleNegativesRankingLoss
widget:
- source_sentence: Sự khác biệt giữa các thời đại trong nghệ thuật trang trí rồng
được thể hiện như thế nào qua các thời Hùng Vương, Lý, Trần, Hồ, Lê, Mạc, Nguyễn?
sentences:
- "Tài liệu tham khảo\r\n323. Nguyễn Quang Ngọc, “Mấy nhận xét về kết cấu kinh tế\
\ của \r\nmột số làng thương nghiệp ờ vùng đồng bằng Bắc Bộ thế kỳ \r\nXVIII-XIX”,\
\ Tạp chí Nghiên cứu Lịch sứ, số 5 (218), 1984.\r\n324. Nguyễn Quang Ngọc, Phan\
\ Đại Doãn, “Mấy ý kiến về hoạt \r\nđộng thương nghiệp ở nông thôn đồng bằng Bắc\
\ Bộ thế kỷ \r\nXVIII-XIX (hiện tượng và bản chất)”, Tạp chí Nghiên cứu\r\nLịch\
\ sử, số 5 (224), 1985.\r\n325. Nguyễn Quang Ngọc, “Thêm vài ý kiến về Tam Điệp”,\
\ Tạp \r\nchí Nghiên cứu Lịch sử, số 1 (244), 1989.\r\n326. Nguyễn Quang Ngọc,\
\ về một số làng buôn ở Đồng bàng Bắc \r\nBộ thế kỳ XVIII-XIX, Hội Sừ học Việt\
\ Nam, 1993.\r\n327. Nguyễn Quang Ngọc, Vũ Văn Quân, “Tư liệu về nguồn gốc \r\n\
chức năng và hoạt động cùa đội Hoàng Sa”, Tạp chí Khoa\r\nhọc xã hội, Đại học\
\ Quốc gia, t.XIV, số 3, 1998, ư. 10-20.\r\n328. Nguyễn Quang Ngọc, “Bảo vệ chủ\
\ quyền ưên Biển Đông: \r\nmột hoạt động nổi bật của vương triều Tây Sơn”, Tạp\
\ chí \r\nLịch sử quân sự, số 1, 1999, tr. 15-18.\r\n329. Nguyễn Quang Ngọc (Chủ\
\ biên), Tiến trình lịch sứ Việt Nam,\r\nNxb. Giáo dục, Hà Nội, 2001.\r\n330.\
\ Nguyền Quân, Phan cẩm Thượng, Mỹ thuật cùa người Việt,\r\nNxb. Mỹ thuật. Hà\
\ Nội. 1989.\r\n331. Nguyễn Tài Thư (Chủ biên), Lịch sử tư tưởng Việt Nam, 2\r\
\ntập, Nxb. Khoa học xã hội, Hà Nội, 1993.\r\n332. Nguyễn Tài Thư, Nho học và\
\ Nho học ớ Việt Nam: Một số lý\r\nluận và thực tiễn, Nxb. Khoa học xã hội, Hà\
\ Nội, 1997.\r\n333. Nguyễn Tưòmg Phượng, Binh chế Việt Nam qua các thời đại,\r\
\nNgày Mai, 1950."
- "Ba Thục, Kinh Sở, Ngô Việt…). Kết thúc cuộc \"Hán Sở tranh hùng\", nhà Hán\r\n\
đã thống nhất đất nước Trung Hoa từ bắc xuống nam (tiền bắc hậu nam) và phát\r\
\ntriển đất nước theo một trật tự ngược lại: tiền nam hậu bắc\".\r\nCó thể hình\
\ dung cơ cấu của văn hóa Trung Hoa như sau: \r\nVĂN HOÁ\r\nTRUNG\r\nHOA\r\n=\r\
\nVăn hoá lưu vực sông Hoàng Hà\r\n+\r\nVăn hoá nông\r\nnghiệp lúa nước\r\nĐông\
\ Nam Á\r\nVăn hoá du\r\nmục Tây Bắc +\r\nVăn hoá nông\r\nnghiệp khối Trung\r\n\
Nguyên\r\nMối liên hệ và sự tác động qua lại giữa văn hóa Việt Nam với Trung Hoa,\r\
\ngiữa văn hóa phương Bắc cổ đại với văn hóa phương Nam cổ đại (trong đó có\r\n\
văn hóa Nam – Á - Bách Việt) có thể trình bày trong bảng 1.5.\r\nVĂN HOÁ\r\nP.BẮC\
\ CỔ ĐẠI\r\nVĂN HOÁ PHƯƠNG NAM (= Đ.N.Á cổ đại)\r\nVăn hoá Nam-Á (Bách Việt)\r\
\nVăn hóa vùng lưu\r\nvực sông Hoàng\r\nHà\r\nVăn hóa vùng lưu\r\nvực sông Dương\r\
\nTử\r\nVăn hóa vùng lưu\r\nvực s. Hồng, s.\r\nMã\r\nVăn hóa miền\r\nTrung và\
\ đồng\r\nbằng s. Mê Kông\r\nVĂN HOÁ TRUNG HOA VĂN HOÁ VIỆT NAM\r\nBảng 1.5: Quan\
\ hệ cội nguồn giữa văn hóa Việt Nam và Trung Hoa\r\nBài 3: TIẾN TRÌNH VĂN HÓA\
\ VIỆT NAM\r\nTiến trình văn hóa Việt Nam có thể chia thành 6 giai đoạn: văn hóa\
\ tiền\r\nsử, văn hóa Văn Lang - Âu Lạc, văn hóa thời chống Bắc thuộc, văn hóa\
\ Đại\r\nViệt, văn hóa Đại Nam và văn hóa hiện đại. Sáu giai đoạn này tạo thành\
\ ba lớp:\r\nlớp văn hóa bản địa, lớp văn hóa giao lưu với Trung Hoa và khu vực,\
\ lớp văn\r\nhóa giao lưu với phương Tây.\r\n3.1. Lớp văn hóa bản địa\r\n28\r\n\
    "
- "trái), và hình bán nguyệt (đôi dưới, phải). Trước mắt ta là sự hòa hợp tuyệt\
\ vời\r\ncủa cái động (vật nhau) trong thế tĩnh của ba hình hình học với những\
\ cạnh đáy\r\nvững vàng cho thấy sự ngang sức ngang tài của các chàng trai; sự\
\ vận động liên\r\ntục của cơ bắp như dừng lại. Hai người chờ vật được khuôn lại\
\ trong hai hình\r\nchữ nhật đứng tạo nên cảm giác co ro bất tận trong cái rét\
\ của lễ hội đầu xuân.\r\n4.1.3. Thủ pháp mô hình hóa đã tạo nên một nền nghệ\
\ thuật trang trí và\r\nnhiều mô hình mang tính triết lí sâu sắc.\r\nBộ Tứ Linh\
\ (Hình 4.20a) với long (rồng) biểu trưng cho uy là nam tính; li\r\n(= long mã)\
    \ hoặc lân (kì lân, con vật tưởng tượng đầu sư tử, mình nai, đuôi trâu,\r\
    \năn cỏ, rất hiền lành - hình 4.20b) biểu trưng cho ước vọng thái bình, quy (rùa)\r\
    \nbiểu tượng cho sự sống lâu và phượng (phụng) biểu tượng cho nữ tính. Rồng -\r\
    \nPhượng biểu tượng cho hạnh phúc lứa đôi (ở Trung Hoa hiện tượng này là\r\n“loan-phượng”:\
    \ loan là con đực, phượng là con cái). Đồ án trang trí RỒNG phổ\r\nbiến đến mức\
    \ phản ánh những đặc trưng của từng thời đại. Rồng thời Hùng\r\nvương, thời Lí,\
    \ Trần, Hồ, Lê, Mạc, Nguyễn – mỗi thời có những nét đặc thù\r\nriêng tương ứng\
    \ với thời đại của mình.\r\nTứ linh cộng thêm ngư-phúc-hạc-hổ thì thành BÁT VẬT.\
    \ Ngư (Cá) gắn\r\nvới truyền thuyết \"cá hóa rồng\" biểu tượng cho sự thành đạt.\
    \ Chữ phúc là “sự tốt\r\nlành, may mắn” đồng âm và viết gần giống với chữ bức\
    \ nghĩa là \"con dơi\", vì"
- source_sentence: Nhiệm vụ quan trọng nhất của các nước công nghiệp chủ nghĩa châu
Âu và Nhật Bản sau chiến tranh thế giới thứ hai là gì?
sentences:
- "Dupuis phái tự mình hành động. Tháng 10-1872, Dupuis đi Hương \r\nCảng và Thượng\
\ Hải mua pháo thuyền và đạn dược, mộ quân lính,\r\n1. Đó là các cuộc thám hiểm\
\ cùa phái đoàn Doudard de Lagrée và Francis \r\nGamier vào những năm từ 1866\
\ đến 1870.\r\n2. Nguyễn Phan Quang (1949), Việt Nam thế ky XIX (1802-1884), Nxb.\
\ \r\nThành phố Hồ Chí Minh, tr. 321.\r\n159\r\nLỊCH SỪ VIỆT NAM - TẬP 6\r\nrồi\
\ đến tháng 11 năm đó thì kéo nhau về Bắc Kỳ. Cùng lúc đó, bọn \r\nthực dân hiếu\
\ chiến ở Nam Kỳ cũng lợi dụng việc triều đình Huế \r\nyêu cầu đưa ra Bắc tiễu\
\ trừ giặc biển để phái tàu chiến ra tiếp tay \r\ncho Dupuis. Cậy có lực lượng\
\ mạnh, Dupuis buộc Kinh lược sứ Lê \r\nTuấn trong vòng hai tuần phải xin triều\
\ đình Huế cho phép hắn \r\nđược mượn đường đi lên Vân Nam. Nhung hạn 2 tuần chưa\
\ hết và \r\ngiấy phép cũng chưa có mà Dupuis đã nổ súng, rồi tự tiện kéo đoàn\
\ \r\ntàu vào Cửa cấm (Hải Phòng) ngược sông Hồng lên Hà Nội (ngày \r\n22-12-1872).\
\ Theo sử nhà Nguyễn thì ngày 2-12-1872, Dupuis “từ\r\nHài Dương đi đen Bắc Ninh,\
\ Hà Nội, các quan tình và quân thứ 2-\r\n3 lần biện bác ngăn trở không cho đi,\
\ nhưng chúng không nghe\r\nTrong khoảng thời gian từ năm 1872 đến năm 1873, Dupuis\
\ đã ỷ \r\nthế quân Pháp và triều đình nhà Thanh, trắng trợn xâm phạm chủ \r\n\
quyền Việt Nam, liên tiếp gây ra nhiều vụ khiêu khích, cướp phá \r\nđối với nhân\
\ dân dọc hai bờ sông, tấn công các đồn bốt của triều \r\nđình nhà Nguyễn.\r\n\
Trước hành động ngang ngược cùa Dupuis, quân dân Hà Nội \r\nmặc dù chưa có lệnh\
\ triều đình nhung vẫn tích cực đề phòng. Lệnh"
- "hội loài người nói chung hay cùa một quốc gia, một dân tộc nói \r\nriêng. Nghiên\
\ cứu lịch sử là nhằm tìm hiểu những sự kiện xảy ra \r\ntrong quá khứ để từ đó\
\ rút ra các bài học kinh nghiệm cho hiện tại \r\nvà tương lai. Nghiên cứu và\
\ biên soạn lịch sừ, vì vậy, trở thành một \r\nyêu cầu bức thiết của mọi quốc\
\ gia, dân tộc. Phạm Công Trứ, nhà \r\nchính trị danh tiếng, nhà sử học sống ở\
\ thế kỳ XVII, trong bài Tựa\r\nsách Đại Việt sử ký bản kỷ tục biên viết: \"Vì\
\ sao mà làm quốc sử?\r\nVĩ sử chù yếu là để ghi chép sự việc. Có chinh trị cùa\
\ một đời tất\r\nphải có sử của một đời. Mà ngòi bút chép sử giữ nghị luận rất\r\
\nnghiêm, ca ngợi đời thịnh trị thì sáng tỏ ngang với mặt trời, mặt\r\ntrăng,\
\ lên án kẻ loạn tặc thì gay gắt nhu sương thu lạnh buốt,\r\nngười thiện biết\
\ có thể bắt chước, người ác biết có thể tự răn, quan\r\nhệ đến việc chính trị\
\ không phải là không nhiều. Cho nên làm sử là\r\ncốt để cho được như thế\"'.\r\
\nViệt Nam là một dân tộc có lịch sử lâu đời. Việt Nam cũng là \r\nmột dân tộc\
\ yêu sử và có rất nhiều người ham thích tìm tòi, nghiên \r\ncứu và biên soạn\
\ lịch sử. Đã có nhiều công trình lịch sử được công \r\nbố, không chi do các cơ\
\ quan, tổ chức chuyên nghiên cứu biên \r\nsoạn, mà còn do cá nhân người yêu sử\
\ thực hiện... Điều này vừa có \r\nmặt tích cực, lại cỏ mặt tiêu cực. Tích cực\
\ vì sẽ góp phần giúp nhân \r\ndân hiểu thêm về lịch sử nước nhà, nhưng cũng chứa\
\ đựng yếu tố \r\ntiêu cực là dễ dẫn tới những hiểu biết phiến diện, sai lầm về\
\ lịch \r\nsử... đôi khi đồng nhất truyền thuyết với lịch sử?"
- "LỊCH SỪ VIỆT NAM - TẬP 11\r\ngiầu mạnh hcm nhờ chiến tranh. Những nước bại trận\
\ như Đức, Ý, \r\nNhật thì kiệt quệ. Song dù thắng hay bại, sự kết thúc chiến\
\ tranh đặt \r\ncho mỗi nước những yêu cầu cấp bách cần giải quyết, tạo nên \r\
\nnhững đặc trưng kinh tế - xã hội ở nhóm nước này.\r\nSau chiến tranh thế giới,\
\ những nưóc công nghiệp chủ nghĩa \r\nchâu Âu và Nhật Bản đều bị chiến tranh\
\ tàn phá nặng nề. Nhiệm vụ \r\nquan trọng của họ ỉà hàn gắn vết thương chiến\
\ tranh, khôi phục \r\nkinh tế, ổn định đời sống xã hội. Đối với Mỹ, nhiệm vụ\
\ chủ yếu là \r\nphải chuyển hướng vận hành kinh tế từ một nền kinh tế phục vụ\
\ \r\nquân sự thời chiến sang nền kinh tế thời bình.\r\nNhừng nét cơ bản của tình\
\ hình thế giới nêu trên đã tác động \r\nđến hầu hết các khu vực trên thế giới,\
\ đặc biệt là khu vực Châu Á \r\nvà Đông Nam Á, tạo điều kiện thuận lợi cho cuộc\
\ đấu tranh giải \r\nphóng của các dân tộc Đông Dương. Từ đầu những năm 1950,\
\ tình \r\nhình cách mạng ba nước Đông Dương chuyển biến nhanh chóng. \r\nVới\
\ cuộc đi thăm Trung Quốc, Liên Xô của Chủ tịch Hồ Chí Minh \r\nđầu năm 1950 và\
\ việc các nước xã hội chủ nghĩa công nhận và đặt \r\nquan hệ ngoại giao với Chính\
\ phủ Việt Nam Dân chủ Cộng hòa là \r\nmột thắng lợi ngoại giao vô cùng quan trọng.\
\ Thắng lợi về ngoại \r\ngiao này đã chấm dứt thời kỳ chiến đấu đom độc, hầu như\
\ bị cách ly \r\nvới bên ngoài và từ đó tiếp nhận được sự đồng tình về chính trị\
\ và \r\nsự viện trợ về vật chất.\r\nVới sự giúp đỡ của Liên Xô, Trung Quốc và\
\ các nước xã hội"
- source_sentence: Chức năng của quan Đốc học trong việc quản lý giáo dục ở các tỉnh
là gì?
sentences:
- "Định, Phú Yên, Biên Hoà, Gia Định, Vĩnh Long, Định Tường, An \r\nGiang đều đặt\
\ mỗi tỉnh một quan Đốc học coi việc học chính trong \r\ntinh. Các tỉnh từ Quảng\
\ Trị, Quảng Bình, Hà Tĩnh, Nghệ An, \r\nThanh Hoá, Ninh Bình, Nam Định, Hà Nội,\
\ Hưng Yên, Hải Dương, \r\nSơn Tây, Bắc Ninh cũng đều đật chức Đốc học. Tinh nào\
\ khuyết \r\nchức Đốc học thì đặt Thự đốc học tạm quyền đốc học một thời gian\
\ \r\nđổ phụ trách, đôn đốc việc học trong tỉnh.\r\nCác tỉnh Khánh Hoà, Bình Thuận,\
\ Hà Tiên, Quảng Yên, Hưng \r\nHoá, Tuyên Quang, Thái Nguyên, Lạng Sơn, Cao Bằng,\
\ do số học \r\nsinh ít nên đến cuối thời Thiệu Trị (1847) vẫn chưa đặt chức Đốc\
\ học.\r\nTheo lệ Nhà nước chế cấp ấn quan phòng giao cho Đốc học lo \r\nviệc\
\ học chính trong địa hạt của tinh sờ tại, trong đó có việc xây \r\ndựng trường\
\ sở ở tinh, phù, hoặc huyện, châu; sắp xếp các thày \r\ngiáo và tuyển chọn học\
\ sinh vào học ở các trường. Những công \r\nviệc licn quun đén việc học đểu có\
\ sự phối hựp giữa quan Đốc hục \r\nvới các viên giữ chức Giáo thụ ở các phủ và\
\ Huấn đạo ờ các huyện, \r\nchâu. Một bộ máy giáo dục được tổ chức chặt chẽ theo\
\ ngành dọc \r\ntừ tinh đến phủ, huyện, châu; tổng (ở tổng có Tổng giáo) để theo\
\ \r\ndõi, đôn đốc việc giảng dạy và học tập, đã góp phần đẩy mạnh hom \r\nviệc\
\ giáo dục ở những triều vua Nguyễn nửa đầu thế kỳ XIX. Những \r\nthành tích của\
\ giáo dục bấy giờ biểu hiện rõ nhất ở việc Nhà nước \r\ncứ 3 năm lại mở một kỳ\
\ thi Hương ờ một số tinh thuộc Bác Kỳ (Nam \r\nĐịnh, Hài Dương, Thăng Long);\
\ Nghệ An; kinh đô Huế; Trung Kỳ"
- "Trước tình hình thế giới và trong nước ngày càng khẩn trương, ngày 28 - I - 1941,\r\
\nlãnh tụ Nguyễn Ái Quốc về nước triệu tập Hội nghị lần thứ 8 Ban Chấp hành\r\n\
Trung ương Đảng Cộng sản Đông Dương. Hội nghị họp tại Pác Bó (Cao Bằng) từ\r\n\
ngày 10 đến ngày 19 - 5 - 1941.\r\nHội nghị chủ †rương trước hết phởi giỏi phóng\
\ cho được cóc dôn tộc\r\nĐông Dương ro khỏi éch Phớp - Nhột. Hội nghị quyết định\
\ tiếp tục tạm\r\ngóc khổu hiệu “Đónh đổ địa chủ, chia ruộng đốt cho dôn còy”\
\ thay bằng\r\ncóc khổu hiệu “Tịch thu ruộng đốt của đế quốc vò Việt gian chia\
\ cho dên\r\ncòy nghèo, giởm †ô, giỏm tức, chia lợi ruộng công”, tiến tới thực\
\ hiện\r\n“Người còy có ruộng”. Hội nghị chủ trương †hònh lộp Việt Nơm độc lập\r\
\nđồng minh (gọi tốt lò Việt Minh) bao gồm céc †ổ chức quồn chúng, lốy\r\ntên\
\ lò Hội Cứu quốc nhồm : “Liên hiệp hết thỏy cóc giới đồng bèo yêu\r\nnước, không\
\ phôn biệt giòu nghèo, giò trẻ, gới trai, không phôn biệt tôn\r\ngiáo vò xu hướng\
\ chính trị, đặng cùng nhau mưu cuộc dôn tộc giỏi phóng\r\nvò sinh tồn” °°,\r\n\
\r\nMặt trận Việt Minh chính thức thành lập ngày 19 - 5 - 1941. Chỉ sau một thời\r\
\ngian ngắn, tổ chức này đã có uy tín và ảnh hưởng sâu rộng trong nhân dân. Sau\
\ Hội\r\nnghị Trung ương, lãnh tụ Nguyễn Ái Quốc đã gửi thư kêu gọi đồng bào cả\
\ nước\r\nđoàn kết thống nhất đánh đuổi Pháp - Nhật."
- "\"Chính sự ngày một đổ nát, đói kém xảy ra luôn luôn. Nhân dân cùng\r\nquân,\
\ khốn khổ, giặc cướp nổi lên ở nhiễu nơi\".\r\n(Khâm định Việt sử thông giám\
\ cương mục)\r\n\r\nỞ Nghệ An, Thanh Hoá, Ninh Bình,... dân nghèo nổi dậy đấu\
\ tranh. Trong\r\ntình hình đó, một số thế lực phong kiến ở các địa phương lại\
\ đánh giết lẫn\r\nnhau, quấy phá nhân dân và chống lại triều đình. Nhà Lý phải\
\ dựa vào thế lực\r\nhọ Trần để chống lại các lực lượng nổi loạn nên đã tạo điều\
\ kiện và thời cơ cho\r\nhọ Trần buộc Chiêu Hoàng (vua cuối cùng của nhà Lý) phải\
\ nhường ngôi cho\r\nTrần Cảnh vào tháng 12, năm Ất Dậu (đâu năm 1226).\r\n\r\n\
(1) Việc thổ mộc : việc làm nhà cửa, chùa, đền, đào sông, hồ..."
- source_sentence: Thiệu Trị đã xử lý trường hợp của Lý Văn Phức và việc người Pháp
bắt giữ thuyền quân đi tuần biển của Việt Nam ra sao?
sentences:
- "hóa; thuế độc quyền; thué điền thổ...\r\nTheo những con số thống kê chính thức\
\ thì các loại thuế trên \r\nđều tăng lên đáng kể, khoảng từ ba đến hơn ba lần\
\ vào năm 1945 \r\n(số dự thu) so với năm 1939 (số thực thu) như sau:\r\nBảng\
\ 29: Thu nhập từ một sổ loại thuế ở Đông Dương \r\ntrong các năm 1939 và 19453\r\
\nĐom vị: nghìn đồng\r\nThuế 1939 1945\r\nThuế tiêu thụ và vận chuyển hàng hoá\
\ 20.655.000 58.265.000\r\nThuế muối, rượu, thuốc phiện, diêm, pháo,\r\nthuốc\
\ lá\r\n24.694.000 87.000.000\r\nThuế điền thổ, trước bạ 11.821.000 28.625.000\r\
\nvề thuốc phiện, do việc nhập khẩu bị ngừng, Pháp khuyến khích \r\nnhân dân thượng\
\ du trồng loại cây này nên số thuốc phiện sản xuất \r\nđược ngày một tăng: năm\
\ 1940: 7.560kg; nãm 1941: 17.344kg; năm\r\n1. Annuaire statistique de V Union\
\ f,rariỊaise Outre- mer 1939-1946, tr. K -\r\n90-93.\r\n2, 3. Annuaire statistique\
\ de runion firanẹaise Outre - mer 1939-1946, tr.\r\nK-90.\r\n552"
- "Chương I. Chính sách thuộc địa của Pháp..\r\nbộ đồng bào các dân tộc thiểu số.\
\ về phương diện này, chính quyền \r\nthuộc địa còn muốn đi xa hơn là cố định\
\ đồng bào vào một không \r\ngian nhất định, rồi đưa họ đến với chế độ sở hữu\
\ ruộng đất - chế độ \r\nsở hữu tập thể và ấn định cho họ một chế độ thuế khóa.\r\
\nNhư vậy, “chính sách thâm nhập” có xuất phát điểm là chính \r\nsách “chia đế\
\ trf' và mục tiêu là tách các dân tộc thiểu số ra khỏi \r\ndân tộc Kinh, dùng\
\ dân tộc nọ chống lại dân tộc kia và nhằm một \r\nmục đích cao hơn là từ chinh\
\ phục, khuất phục về chính trị để tiến \r\nsang khai thác, bóc lột về đất đai,\
\ nhân công và thuế khóa của các \r\nđồng bào.\r\n7. Một số “cải cách” xã hội\
\ khác liên quan đến nông dân và\r\ncông nhân\r\nLiên quan đến nông dân, trong\
\ bài diễn văn về Tinh hình Đông\r\nDương và tuyên bo cải cách vào tháng 9/19301,\
\ Pierre Pasquier nêu \r\nra những vấn đề như: thi hành luật điền thổ, giúp nông\
\ dân Nam Kỳ \r\nthế chấp ruộng đất để vay tín dụng ngân hàng; dẫn thủy nhập điền,\
\ \r\nlàm thuỷ lợi để tăng diện tích canh tác, cải tiến kỹ thuật trồng trọt; \r\
\ngiúp nông dân thăng tién về sờ hữu ruộng đất (từ người không có \r\nđất lên\
\ tiểu điền chủ); mở rộng việc nhượng đất, khẩn hoang ở \r\nnhững vùng rừng núi\
\ ở Bắc và Trung Kỳ cũng như ở phía tây và \r\nnam Nam Kỳ; quy định lại chế độ\
\ lĩnh canh để \"hạn ché bớt sự bóc\r\nlột cùa địa chù đoi với tá điền”.\r\nTriển\
\ khai những “cải cách” này, Pierre Pasquier cho tiếp tục \r\nxây dựng các công\
\ trình thuỷ nông, rồi thành lập Hội đồng Khẩn"
- "theo vài mươi người, đeo gươm, đeo súng, đến thẳng ngay công \r\nquán, đưa ra\
\ một lá thư của nước Pháp bằng chữ Hán, lời lẽ ngang \r\nngược. Lý Văn Phức không\
\ nhận thư, Lạp Biệt Nhĩ quát to doạ nạt, \r\nđể lại thư xuống ghế rồi đi. Lý\
\ Văn Phức và Nguyễn Đình Tân bàn \r\nvới nhau rằng: \"Nhận lấy thư là có tội,\
\ mà đốt thư đi cũng có tội, \r\nkhông gì bằng cho chạy trạm về đệ tâu lên\".\
\ Lý Văn Phức về Kinh,\r\n1. Thực lục, tập VI, sđd, tr. 301.\r\n492\r\nChương\
\ VII. Quan hệ đối ngoại\r\nThiệu Trị giận là làm mất quốc thể, sai vệ cẩm y đóng\
\ gông đem \r\ngiam ở Tà đãi lậu, bắt giải chức, giao cho đình thần bàn.\r\nKhi\
\ ấy, bọn Pháp ngày thường lên bờ, ngông nghênh đi lại các \r\nnơi giao tiếp với\
\ dân đi đạo. Những thuyền quân đi tuần biển bị \r\nchúng bắt giữ lại ở cừa biển\
\ và cướp lấy buồm thuyền và dây buộc \r\nthuyền cùa 5 chiếc thuyền bọc đồng ở\
\ Kinh phái đi Nam (Kim \r\nƯng, Phấn Bằng, Linh Phượng, Thọ Hạc, Vân Bằng) đậu\
\ ở vụng \r\nTrà Sơn, đối diện vói chiến thuyền Pháp.\r\nViệc báo lên, Thiệu Trị\
\ sai ngay Đô thống Hữu quân Mai Công \r\nNgôn, Tham tri Bộ Hộ Đào Trí Phú đem\
\ biền binh 3 vệ Vũ lâm, Hổ \r\noai, Hùng nhuệ đến Quảng Nam cùng với lực lượng\
\ thủy, bộ tại \r\nchỗ tổ chức bố phòng. Thiệu Trị truyền chi căn dặn Mai Công\
\ \r\nNgôn và Đào Trí Phú rằng: \"Người Tây dương nếu đã sợ uy, thu \r\nhình,\
\ thì ta không nên tự động thủ trước; nếu chúng sinh chuyện \r\ntrước, thì đốc\
\ sức thành đài cùng biền binh các hiệu thuyền và \r\nthuyền đồng do Kinh phái\
\ đi, ngoài hợp, trong ứng, lập tức đánh"
- source_sentence: Gia Cát Lượng đã giúp ai trong việc quản lý nước Thục?
sentences:
- "phải trông coi mọi việc, giúp Thành Vương đến lúc trưởng thành. \r\n4\r\n Hoắc\
\ Quang giữ chức Đại tư mã tướng quân, phò Hán Chiêu Đế lúc lên ngôi mới 9 tuổi.\
\ \r\n5\r\n Gia Cát Lượng tức Khổng Minh, là thừa tướng của Chiêu Đế Lưu Bị nước\
\ Thục đời Tam Quốc. Lưu Bị chết, con là Lưu Thiện nối \r\nngôi, tức Thục Hậu\
\ chúa, mọi việc nước, việc quân đều phải trông cậy vào Gia Cát Lượng. \r\n6\r\
\n Tô Hiến Thành là Thái úy triều Lý Cao Tông, nhận di mệnh Cao Tông phò vua nhỏ\
\ là Long Cán lên nối ngôi mới 3 tuổi. \r\n7\r\n Tứ phụ: nghĩa là bốn viên đại\
\ thần giúp vua khi mới lên ngôi. \r\n8\r\n Chỉ Thuận Tông. \r\n9\r\n Xích chủy:\
\ nghĩa là mõm đỏ, miệng đỏ, hay đỏ mỏ. Xích chủy hầu là loài đỏ mỏ ám chỉ Lê\
\ Quý Ly. \r\n10 Bạch kê: nghĩa là gà trắng. Nghệ Tông sinh năm Tân Dậu, tức năm\
\ gà. Tân thuộc hành kim, loài kim sắc trắng. Vì thế \"bạch kê\" \r\nám chỉ Nghệ\
\ Tông. \r\n11 Chữ vương? ở trong lòng chữ khẩu? là chữ \"quốc\"?. \r\n12 Theo\
\ tục nhà Trần, hằng năm vào ngày mồng 4 tháng 4, vua hội họp bề tôi làm lễ tuyên\
\ thệ ở đền Đồng Cổ. (Xem bản kỷ, quyển \r\n5, Kiến Trung năm thứ 3, 1277). \r\
\n13 Chỉ Quý Ly. \r\n288 Đại Việt Sử Ký Toàn Thư - Bản Kỷ - Quyển VIII \r\nQuý\
\ Ly bỏ mũ, rập đầu khóc lóc từ tạ, chỉ trời vạch đất thề rằng: \r\n\"Nếu thần\
\ không biết dốc lòng trung, hết sức giúp Quan gia để truyền đến con cháu về sau\
\ thì \r\ntrời sẽ ghét bỏ thần\". \r\nQuý Ly lại nói: \"Lúc Linh Đức Vương làm\
\ điều thất đức, nếu không nhờ oai linh bệ hạ thì thần đã"
- "éo, xênh xang lạ hom cả\", và gánh xiếc của BẮc thành trổ tài dịp Đại \r\nkhánh\
\ \"Ngũ tuần\" của vua: \"4 đứa leo dây, đứa trẻ lộn dây, đứa trẻ \r\nmúa trên\
\ bàn tay 2 đứa\".\r\nNhững định chế về tổ chức và hoạt động nghệ thuật của nhà\
\ \r\nNguyễn đã có tác dụng quan ữọng kích thích các loại hình vãn nghệ \r\ndân\
\ gian phát triển cả về số lượng lẫn chất lượng. Trong các đợt biểu \r\ndiễn ở\
\ Kinh đô, trước yêu cầu thưởng lãm nghiêm ngặt và cao hơn \r\nđịa phương, các\
\ nhà viết kịch bản. đạo diễn, diễn viên phải trau dồi để \r\nnâng cao năng lực\
\ sáng tác, dàn dựng và kỹ năng biểu diễn.\r\n2. Nghệ thuật dân gian\r\nSinh hoạt\
\ văn nghệ dân gian trong các làng quê cũng phát triển. \r\nỞ Bắc Kỳ, Bắc Trung\
\ Kỳ, hát ả đào rất phổ biến. Bên cạnh đó là \r\ncác thể loại dân ca: hát Xoan\
\ ở Phú Thọ, Quan họ Bắc Ninh, hát \r\nSli, Then ở Lạng Sơn, hát Ví dặm, Phường\
\ vải ở Nghệ An, Hà \r\nTĩnh. Ở các tinh trung du và đồng bằng Bắc Bộ, Thanh Hóa,\
\ chèo \r\nsân đình mang tính trào lộng nở rộ. Thể loại trò hài, xiếc ở Bắc Kỳ\
\ \r\ncũng thu hút đông đảo khán giả.\r\n639"
- "Tây. Ngoài cơ sờ đúc súng cũ của tiên triều, năm 1825 vua Minh \r\nMệnh mờ thêm\
\ sáu xưởng nữa. vốn cần cù và ham học hỏi sáng \r\ntạo, những người thợ quân\
\ giới đã được \"thứ súng tay nạp thuốc nổ \r\nmạnh theo kiểu Tây dương\". Vào\
\ những năm cuối triều Minh \r\nM ệnh, họ đã đúc 15 cỗ đại pháo X ung tiêu băng\
\ đồng và hai cỗ \r\nsúng lớn Chấn hải, loại đại pháo lợi hại trong thủy chiến\
\ phương \r\nTây. Sau đó, lại xuất xưởng tiếp 30 cỗ Chấn hải. Năm 1829, quản \r\
\nkho Hải Dương là Tôn Thất Thiện cùng với 100 lính Chấn cơ chế \r\nra cối gỗ\
\ chạy bàng sức nước ở khe suối để giã, luyện thuốc súng. \r\nDụng cụ này là xe\
\ \"Thủy hỏa ký tế\", và những năm sau được phổ \r\ncập trong quân ngũ. Từ vũ\
\ khí phương Tây, người Đại Nam đã tự \r\ntìm hiểu từng chi tiết để chế tạo thước\
\ đo ngắm bắn, thước kiểm tra \r\nthuốc súng. Trong bảy năm ờ ngôi, vua Thiệu\
\ Trị đúc 9 cỗ súng \r\nbàng đồng hiệu là \"Thần uy phục viễn đại tướng quân\"\
, cỗ to nhất \r\nlà 10.706 cân, cỗ nhỏ nhất là 10.222 cân, tổng cộng là 93.829\
\ cân.\r\n649\r\nLỊCH SỬ VIỆT NAM - TẬP 5\r\nVà ba cỗ súng hiệu \"Bảo Đại định\
\ công an dân hòa chúng thượng \r\ntướng quân\", mỗi cỗ trên 14.500 cân, tổng\
\ cộng là 43.620 cân1.\r\nĐe tạo điều kiện cho quân thủy học tập, bộ Công cấp\
\ cho họ la \r\nbàn, thước đo nước, đồng hồ cát xem giờ của phương Tây. v ề khoa\
\ \r\nmục bắn súng thì lính thủy phải tập bắn súng điểu sang và đại bác. \r\n\
Minh Mệnh yêu cầu Hiệp biện Đại học sĩ lãnh Thượng thư bộ Binh \r\nTrương Đăng\
\ Quế đọc kỹ các sách và bản đồ thủy chiến \"Tây"
model-index:
- name: SentenceTransformer based on Alibaba-NLP/gte-multilingual-base
results:
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: gte multilingual base 768
type: gte_multilingual_base_768
metrics:
- type: cosine_accuracy@1
value: 0.3972602739726027
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.6333333333333333
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.7132420091324201
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.7817351598173516
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.3972602739726027
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.21111111111111108
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.142648401826484
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.07817351598173515
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.3972602739726027
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.6333333333333333
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.7132420091324201
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.7817351598173516
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.5921213055171655
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.5309868087265359
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.537969151887342
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: gte multilingual base 512
type: gte_multilingual_base_512
metrics:
- type: cosine_accuracy@1
value: 0.38767123287671235
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.6310502283105023
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.7095890410958904
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.7821917808219178
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.38767123287671235
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.21035007610350073
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.14191780821917807
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.07821917808219177
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.38767123287671235
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.6310502283105023
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.7095890410958904
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.7821917808219178
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.5879636635574841
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.525339204174821
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.5318727014135456
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: gte multilingual base 256
type: gte_multilingual_base_256
metrics:
- type: cosine_accuracy@1
value: 0.3771689497716895
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.6146118721461187
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.6872146118721462
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.7662100456621005
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.3771689497716895
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.20487062404870623
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.13744292237442923
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.07662100456621006
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.3771689497716895
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.6146118721461187
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.6872146118721462
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.7662100456621005
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.5736037026704126
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.5116503587736474
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.5189035063838257
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: gte multilingual base 128
type: gte_multilingual_base_128
metrics:
- type: cosine_accuracy@1
value: 0.36118721461187214
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.582648401826484
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.6502283105022831
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.7342465753424657
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.36118721461187214
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.1942161339421613
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.1300456621004566
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.07342465753424657
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.36118721461187214
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.582648401826484
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.6502283105022831
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.7342465753424657
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.5465887777560341
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.4866068710589268
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.49427672079491064
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: gte multilingual base 64
type: gte_multilingual_base_64
metrics:
- type: cosine_accuracy@1
value: 0.3082191780821918
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.5146118721461187
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.5863013698630137
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.6621004566210046
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.3082191780821918
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.17153729071537288
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.11726027397260275
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.06621004566210045
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.3082191780821918
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.5146118721461187
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.5863013698630137
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.6621004566210046
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.4843188931282978
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.4275081539465107
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.4370689716929827
name: Cosine Map@100
---
# SentenceTransformer based on Alibaba-NLP/gte-multilingual-base
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [Alibaba-NLP/gte-multilingual-base](https://huggingface.co/Alibaba-NLP/gte-multilingual-base) on a CSV dataset of Vietnamese anchor/positive pairs (questions matched with supporting passages). It maps sentences and paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [Alibaba-NLP/gte-multilingual-base](https://huggingface.co/Alibaba-NLP/gte-multilingual-base) <!-- at revision ca1791e0bcc104f6db161f27de1340241b13c5a4 -->
- **Maximum Sequence Length:** 8192 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Dataset:**
- csv
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 8192, 'do_lower_case': False}) with Transformer model: NewModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
)
```
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("minhtuan7akp/gte-base-vietnamese-finetune-matryoshka")
# Run inference
sentences = [
'Gia Cát Lượng đã giúp ai trong việc quản lý nước Thục?',
'phải trông coi mọi việc, giúp Thành Vương đến lúc trưởng thành. \r\n4\r\n Hoắc Quang giữ chức Đại tư mã tướng quân, phò Hán Chiêu Đế lúc lên ngôi mới 9 tuổi. \r\n5\r\n Gia Cát Lượng tức Khổng Minh, là thừa tướng của Chiêu Đế Lưu Bị nước Thục đời Tam Quốc. Lưu Bị chết, con là Lưu Thiện nối \r\nngôi, tức Thục Hậu chúa, mọi việc nước, việc quân đều phải trông cậy vào Gia Cát Lượng. \r\n6\r\n Tô Hiến Thành là Thái úy triều Lý Cao Tông, nhận di mệnh Cao Tông phò vua nhỏ là Long Cán lên nối ngôi mới 3 tuổi. \r\n7\r\n Tứ phụ: nghĩa là bốn viên đại thần giúp vua khi mới lên ngôi. \r\n8\r\n Chỉ Thuận Tông. \r\n9\r\n Xích chủy: nghĩa là mõm đỏ, miệng đỏ, hay đỏ mỏ. Xích chủy hầu là loài đỏ mỏ ám chỉ Lê Quý Ly. \r\n10 Bạch kê: nghĩa là gà trắng. Nghệ Tông sinh năm Tân Dậu, tức năm gà. Tân thuộc hành kim, loài kim sắc trắng. Vì thế "bạch kê" \r\nám chỉ Nghệ Tông. \r\n11 Chữ vương? ở trong lòng chữ khẩu? là chữ "quốc"?. \r\n12 Theo tục nhà Trần, hằng năm vào ngày mồng 4 tháng 4, vua hội họp bề tôi làm lễ tuyên thệ ở đền Đồng Cổ. (Xem bản kỷ, quyển \r\n5, Kiến Trung năm thứ 3, 1277). \r\n13 Chỉ Quý Ly. \r\n288 Đại Việt Sử Ký Toàn Thư - Bản Kỷ - Quyển VIII \r\nQuý Ly bỏ mũ, rập đầu khóc lóc từ tạ, chỉ trời vạch đất thề rằng: \r\n"Nếu thần không biết dốc lòng trung, hết sức giúp Quan gia để truyền đến con cháu về sau thì \r\ntrời sẽ ghét bỏ thần". \r\nQuý Ly lại nói: "Lúc Linh Đức Vương làm điều thất đức, nếu không nhờ oai linh bệ hạ thì thần đã',
'Tây. Ngoài cơ sờ đúc súng cũ của tiên triều, năm 1825 vua Minh \r\nMệnh mờ thêm sáu xưởng nữa. vốn cần cù và ham học hỏi sáng \r\ntạo, những người thợ quân giới đã được "thứ súng tay nạp thuốc nổ \r\nmạnh theo kiểu Tây dương". Vào những năm cuối triều Minh \r\nM ệnh, họ đã đúc 15 cỗ đại pháo X ung tiêu băng đồng và hai cỗ \r\nsúng lớn Chấn hải, loại đại pháo lợi hại trong thủy chiến phương \r\nTây. Sau đó, lại xuất xưởng tiếp 30 cỗ Chấn hải. Năm 1829, quản \r\nkho Hải Dương là Tôn Thất Thiện cùng với 100 lính Chấn cơ chế \r\nra cối gỗ chạy bàng sức nước ở khe suối để giã, luyện thuốc súng. \r\nDụng cụ này là xe "Thủy hỏa ký tế", và những năm sau được phổ \r\ncập trong quân ngũ. Từ vũ khí phương Tây, người Đại Nam đã tự \r\ntìm hiểu từng chi tiết để chế tạo thước đo ngắm bắn, thước kiểm tra \r\nthuốc súng. Trong bảy năm ờ ngôi, vua Thiệu Trị đúc 9 cỗ súng \r\nbàng đồng hiệu là "Thần uy phục viễn đại tướng quân", cỗ to nhất \r\nlà 10.706 cân, cỗ nhỏ nhất là 10.222 cân, tổng cộng là 93.829 cân.\r\n649\r\nLỊCH SỬ VIỆT NAM - TẬP 5\r\nVà ba cỗ súng hiệu "Bảo Đại định công an dân hòa chúng thượng \r\ntướng quân", mỗi cỗ trên 14.500 cân, tổng cộng là 43.620 cân1.\r\nĐe tạo điều kiện cho quân thủy học tập, bộ Công cấp cho họ la \r\nbàn, thước đo nước, đồng hồ cát xem giờ của phương Tây. v ề khoa \r\nmục bắn súng thì lính thủy phải tập bắn súng điểu sang và đại bác. \r\nMinh Mệnh yêu cầu Hiệp biện Đại học sĩ lãnh Thượng thư bộ Binh \r\nTrương Đăng Quế đọc kỹ các sách và bản đồ thủy chiến "Tây',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
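Because the model was trained with MatryoshkaLoss over the dimensions 768, 512, 256, 128 and 64, its embeddings can be truncated to a smaller size for cheaper storage and faster search, at the modest accuracy cost reported in the Evaluation section below. A minimal sketch, assuming a `sentence-transformers` release recent enough to support the `truncate_dim` argument:
```python
from sentence_transformers import SentenceTransformer

# Load with a smaller output dimensionality; sensible values match the
# matryoshka_dims used during training: 768, 512, 256, 128 or 64.
model = SentenceTransformer(
    "minhtuan7akp/gte-base-vietnamese-finetune-matryoshka",
    truncate_dim=256,
)

embeddings = model.encode([
    "Gia Cát Lượng đã giúp ai trong việc quản lý nước Thục?",
])
print(embeddings.shape)
# (1, 256)
```
At 256 dimensions the index is a third of the size of the full 768-dimensional one, while cosine_ndcg@10 drops only from about 0.592 to 0.574 (see the evaluation table below).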
<!--
### Direct Usage (Transformers)
<details><summary>Click to see the direct usage in Transformers</summary>
</details>
-->
<!--
### Downstream Usage (Sentence Transformers)
You can finetune this model on your own dataset.
<details><summary>Click to expand</summary>
</details>
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
## Evaluation
### Metrics
#### Information Retrieval
* Datasets: `gte_multilingual_base_768`, `gte_multilingual_base_512`, `gte_multilingual_base_256`, `gte_multilingual_base_128` and `gte_multilingual_base_64`
* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
| Metric | gte_multilingual_base_768 | gte_multilingual_base_512 | gte_multilingual_base_256 | gte_multilingual_base_128 | gte_multilingual_base_64 |
|:--------------------|:--------------------------|:--------------------------|:--------------------------|:--------------------------|:-------------------------|
| cosine_accuracy@1 | 0.3973 | 0.3877 | 0.3772 | 0.3612 | 0.3082 |
| cosine_accuracy@3 | 0.6333 | 0.6311 | 0.6146 | 0.5826 | 0.5146 |
| cosine_accuracy@5 | 0.7132 | 0.7096 | 0.6872 | 0.6502 | 0.5863 |
| cosine_accuracy@10 | 0.7817 | 0.7822 | 0.7662 | 0.7342 | 0.6621 |
| cosine_precision@1 | 0.3973 | 0.3877 | 0.3772 | 0.3612 | 0.3082 |
| cosine_precision@3 | 0.2111 | 0.2104 | 0.2049 | 0.1942 | 0.1715 |
| cosine_precision@5 | 0.1426 | 0.1419 | 0.1374 | 0.13 | 0.1173 |
| cosine_precision@10 | 0.0782 | 0.0782 | 0.0766 | 0.0734 | 0.0662 |
| cosine_recall@1 | 0.3973 | 0.3877 | 0.3772 | 0.3612 | 0.3082 |
| cosine_recall@3 | 0.6333 | 0.6311 | 0.6146 | 0.5826 | 0.5146 |
| cosine_recall@5 | 0.7132 | 0.7096 | 0.6872 | 0.6502 | 0.5863 |
| cosine_recall@10 | 0.7817 | 0.7822 | 0.7662 | 0.7342 | 0.6621 |
| **cosine_ndcg@10** | **0.5921** | **0.588** | **0.5736** | **0.5466** | **0.4843** |
| cosine_mrr@10 | 0.531 | 0.5253 | 0.5117 | 0.4866 | 0.4275 |
| cosine_map@100 | 0.538 | 0.5319 | 0.5189 | 0.4943 | 0.4371 |
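The numbers above were produced by running the evaluator once per Matryoshka dimensionality. A hedged sketch of how such a run is wired up; the query, corpus and relevance mappings here are hypothetical toy placeholders, not the actual evaluation split:
```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

model = SentenceTransformer("minhtuan7akp/gte-base-vietnamese-finetune-matryoshka")

# Toy data: query IDs -> questions, corpus IDs -> passages,
# and each query ID -> the set of relevant corpus IDs.
queries = {"q1": "Gia Cát Lượng đã giúp ai trong việc quản lý nước Thục?"}
corpus = {"d1": "<passage answering q1>", "d2": "<unrelated passage>"}
relevant_docs = {"q1": {"d1"}}

evaluator = InformationRetrievalEvaluator(
    queries=queries,
    corpus=corpus,
    relevant_docs=relevant_docs,
    name="gte_multilingual_base_768",
    truncate_dim=768,  # repeat with 512/256/128/64 for the other columns
)
results = evaluator(model)
print(results)  # includes cosine_accuracy@k, cosine_ndcg@10, cosine_mrr@10, ...
```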
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Dataset
#### csv
* Dataset: csv
* Size: 21,892 training samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 1000 samples:
| | anchor | positive |
|:--------|:-----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 9 tokens</li><li>mean: 26.95 tokens</li><li>max: 103 tokens</li></ul> | <ul><li>min: 25 tokens</li><li>mean: 373.94 tokens</li><li>max: 596 tokens</li></ul> |
* Samples:
| anchor | positive |
|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>Tính chất kiến trúc của đình làng triều Mạc được thể hiện qua những đặc điểm gì, như số gian, hình dạng, nội thất và cách bố trí không gian trong công trình?</code> | <code>Đình làng là công trình kiến trúc công cộng được dựng nên
<br>băng sự đóng góp của cải và công sức của cả cộng đồng làng xã.
<br>Ngoài chức năng là trụ sở hành chính của cả làng, ngôi đình còn là
<br>trung tâm sinh hoạt văn hóa làng xã, là nơi diễn ra các nghi lễ trọng
<br>đại trong dịp tế lễ thần Thành hoàng làng và tô chức hội hè hăng
<br>năm. Có thê nói, ngôi đình làng là nơi hội tụ sức mạnh của cả cộng
<br>đồng và là biểu trưng đặc sắc nhất của văn hóa làng xã.
<br>
<br>Trong các ngôi đình triều Mạc, Thân thành hoàng có lý lịch
<br>xuất thân khá phong phú. Tản Viên sơn thánh là vị thần có ảnh
<br>hưởng lớn ở xứ Đoài được thờ phụng ở đình Tây Đăng, Thanh Lũng
<br>và nhiều làng xã khác. Thần Cao Sơn, Quý Minh tương truyền là
<br>tướng tâm phúc của Hùng Vương được thờ ở đình làng Lỗ Hạnh.
<br>Dân làng Lỗ Hạnh còn thờ cả Phương Dung công chúa... Từ thế
<br>kỷ XYVI và các thế kỷ tiếp sau, Thần thành hoàng làng trở thành
<br>vị vua tỉnh thần ở các làng xã, tín ngưỡng thờ cúng Thân thành
<br>hoàng càng trở nên phong phú thê hiện qua lễ...</code> |
| <code>Nguyễn Khắc Nhu có vai trò gì trong khởi nghĩa toàn khu vực miền núi Bắc Kỳ của Việt Nam Quốc dân Đảng vào năm 1930?</code> | <code>bị nổ do bất cẩn. Do đó công việc bị phát hiện. Hai người phụ trách
<br>cơ quan chế bom là Đỗ Cương và Quản Trác trốn thoát. Nhiều binh
<br>lính và dân thường bị bắt. Công việc bạo động của Xứ Nhu không
<br>thành. Đúng lúc này Việt Nam Quốc dân Đảng vừa thành lập, cử
<br>người tới mời Xứ Nhu và Việt Nam Dân quốc gia nhập Việt Nam
<br>Quốc dân Đảng. Hầu hết các đồng chí của Xứ Nhu trở thành đảng
<br>viên của Việt Nam Quốc dân Đảng ở vùng Bắc Ninh, Bắc Giang.
<br>Do đó, Việt Nam Quốc dân Đảng mạnh lên về số lượng1. Cùng với
<br>việc phát triển đảng viên ở Bẳc Ninh, Bắc Giang, Việt Nam Quốc
<br>dân Đảng còn thiết lập nhiều cơ sở ở các tỉnh Thái Bình, Hải Dương,
<br>1. Nguyễn Khắc Nhu tức Xứ Nhu (1882-1930), người làng Song Khê, huyện
<br>Yên Dũng, tinh Bắc Giang. Với lòng yêu nuớc và ý chí chống Pháp,
<br>ông dự tính thành lập một tổ chức hoạt động công khai nhăm đào tạo
<br>tài năng cho đất nước lấy tên là "Hội Quốc dân dục tài”. Việc này
<br>không thành công, ông lại lập tổ chức bí mật nhăm bạo động lật đổ ách
<br>áp b...</code> |
| <code>Giá gạo tháng 3-1950 ở Liên khu IV là bao nhiêu đồng/tạ và có chênh lệch gì so với giá gạo ở Liên khu III và Liên khu Việt Bắc?</code> | <code>ngày càng tăng nhanh, nhất là ở Việt Bắc. Giá gạo tăng mạnh
<br>nhất, giá thực phẩm cũng tăng dần theo giá gạo. Giá các mặt hàng
<br>kỹ nghệ tăng chậm hơn. Giá hàng ngoại hóa hầu như không tăng
<br>vỉ trong vùng Pháp chiếm đóng, hàng ngoại hóa tính bằng tiền
<br>Đông Dương không tăng, hom nữa nhân dân cũng ít tiêu thụ hàng
<br>ngoại hóa vì bị cấm.
<br>1. Viện Kinh tế học, Kinh tế Việt Nam từ Cách mạng Tháng Tám đến..., Sách
<br>đã dẫn, tr. 238.
<br>2. Chuơng trình và báo cáo của Bộ Kinh tế về tình hình hoạt động năm 1950.
<br>Trung tâm lưu trữ quốc gia in, phông Phủ Thủ tướng, Hồ sơ số 1914.
<br>488
<br>Chương VI. Việt Nam dân chủ cộng hòa xây dựng..
<br>Giá gạo trong những tháng đầu năm 1950 so với cuối năm 1949
<br>có thay đổi, Liên khu IV (Thanh Hóa) giá tăng lên 154%; Liên khu
<br>III (Hà Đông - Hà Nam) giá tăng lên 153%; Liên khu Việt Bắc
<br>(Thái Nguyên) giá tăng lên 800%.
<br>Giá gạo ở Thái Nguyên từ 1.625 đồng/tạ lên 13.000 đồng/tạ
<br>(tăng 800%); ờ Phú Thọ từ 2.650 đồng/tạ lên 7.500 đồng/tạ (tăng
<br>283%). Mặt khác, ...</code> |
* Loss: [<code>MatryoshkaLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters:
```json
{
"loss": "MultipleNegativesRankingLoss",
"matryoshka_dims": [
768,
512,
256,
128,
64
],
"matryoshka_weights": [
1,
1,
1,
1,
1
],
"n_dims_per_step": -1
}
```
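In code, this configuration corresponds to wrapping `MultipleNegativesRankingLoss` in `MatryoshkaLoss`. A minimal sketch of how the loss above would be constructed, assuming the base model is loaded fresh (`trust_remote_code=True` is needed because gte-multilingual-base ships a custom architecture):
```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss

model = SentenceTransformer("Alibaba-NLP/gte-multilingual-base", trust_remote_code=True)

# Inner loss: in-batch negatives over (anchor, positive) pairs.
inner_loss = MultipleNegativesRankingLoss(model)

# Outer loss: apply the inner loss at every Matryoshka dimensionality,
# with equal weights, matching the JSON parameters above.
loss = MatryoshkaLoss(
    model,
    inner_loss,
    matryoshka_dims=[768, 512, 256, 128, 64],
    matryoshka_weights=[1, 1, 1, 1, 1],
)
```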
### Evaluation Dataset
#### csv
* Dataset: csv
* Size: 21,892 evaluation samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 1000 samples:
| | anchor | positive |
|:--------|:------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 10 tokens</li><li>mean: 26.56 tokens</li><li>max: 108 tokens</li></ul> | <ul><li>min: 24 tokens</li><li>mean: 369.01 tokens</li><li>max: 559 tokens</li></ul> |
* Samples:
| anchor | positive |
|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>Nguyễn Hoàng đã thực hiện những hành động gì để dần dần tách khỏi sự ràng buộc của họ Trịnh sau khi trở lại Thuận Quảng vào năm 1600, và những hành động này đã ảnh hưởng như thế nào đến mối quan hệ giữa hai dòng họ?</code> | <code>thẳng đối với họ Nguyễn. Trịnh Tùng đã lấy danh nghĩa vua Lê sai
<br>sứ giả là Thiêm đô ngự sử Lê Nghĩa Trạch đem sắc vào phủ dụ
<br>Nguyễn Hoàng và vẫn cho ở lại trấn thủ, hằng năm nộp thuế như
<br>cũ. Cùng với sắc của vua Lê, Trịnh Tùng có gửi thư kèm theo
<br>Chương ĩ. Sự phân liệt Đàng Trong - Đàng Ngoài...
<br>1, Toàn thư. quyển 17, tập IV, Sđd, tr. 200.
<br>2, Đại Nam thực lục, Tiền biên, quyển 1, tập I, Sđd, tr. 34.
<br>3, Đại Nam thực lục, Tiển biên, quyển 1, tập I, Sđd, tr. 35.
<br>39
<br>LỊCH SỬ VIỆT NAM - TẬP 4
<br>"khuyên giữ việc thuế cống". Nguyễn Hoàng sai sứ giả đáp lễ tạ on
<br>vua Lê và gửi thư cho Trịnh Tùng hẹn kết nghĩa thông gia, đem con
<br>gái là Ngọc Tú gả cho Trịnh Tráng (con Trịnh Tùng) lấy danh
<br>nghĩa hôn nhân để duy trì mối quan hệ bề ngoài giao hảo giữa hai
<br>dòng họ vốn có sẵn một mối thù địch.
<br>- Chính sách cùa họ Nguyễn từ khi Nguyễn Hoàng trở lại
<br>Thuận Quảng
<br>Năm 1600, Nguyễn Hoàng ròi được khỏi đất Bẳc trở về Thuận
<br>Quảng bắt đầu thực hiện một chính sách cai trị mói, dần dần tác...</code> |
| <code>Báo cáo của Ủy ban Kháng chiến hành chính Hà Nội về hoạt động giáo dục bù nhìn và tình hình các giáo sư trường Chu Văn An có nội dung gì?</code> | <code>Tài liệu tham khảo
<br>21. Báo cáo sô' 2 BC/I ngày 12-11-1949 và Báo cáo sô' 463
<br>BC/DB ngày 25-12-1949 của Ty Công an H à Nội. Trung
<br>tâm Lưu trữ Quốc gia III, phông Phủ Thủ tướng, Hồ sơ
<br>SỐ921.
<br>28. Báo “Le song” ngày 11-2-1949. Trung tâm Lưu trữ Quốc
<br>gia III, phông Phủ Thủ tướng, Hồ sơ sô' 2002.
<br>29. Báo cáo của u ỷ ban Kháng chiến hành chính Hà Nội vê
<br>hoạt động giáo dục bù nhìn và tình hình các giáo sư
<br>trường Chu Văn An. Trung tâm Lưu trữ Quốc gia III,
<br>phông Phủ Thủ tướng, Hồ sơ số 979.
<br>30. Báo cáo của Tổng Giám đốc Việt N am Công an vụ sô'
<br>122/NCB3 ngày 1-4-1951. Trung tâm Lưu trữ Quốic gia
<br>III, phông Phủ Thủ tướng, Hồ sơ sô' 979.
<br>31. Báo cáo thành tích về cống tác công an trong 8 năm kháng
<br>chiến (1946-1954) của Bộ Công an. Trung tâm Lưu trữ
<br>Quốc gia III, phông Phủ Thủ tướng, Hồ sơ sô' 927.
<br>32. Báo cáo một năm kháng chiến (12-1946 đến 12-1947) của
<br>UBKCHC Khu 12. Trung tâm Lưu trữ Quốc gia III, phông
<br>Phủ Thủ tướng, Hồ sơ sô" 2000.
<br>33. Báo cáo thành tích quăn sự trong 8 n...</code> |
| <code>Đặc điểm dân số của nước ta ảnh hưởng đến các ngành dịch vụ như thế nào và đòi hỏi những ngành dịch vụ nào cần được ưu tiên phát triển trong quá trình đô thị hóa?</code> | <code>— Trong các thành phố lớn thường hình thành các trung tâm giao dịch,
<br>thương mại. Đó là nơi tập trung các ngân hàng, các văn phòng đại diện
<br>của các công ti, các siêu thị hay các tổ hợp thương mại, dịch vụ lớn...
<br>Ở các thành phố lớn trên thế giới, thường dễ nhận thấy các trung tâm
<br>thương mại này do sự tập trung các ngôi nhà cao tầng, chọc trời. Một
<br>thành phố có thể có trung tâm thương mại chính và một số trung tâm
<br>thương mại nhỏ hơn, kết quả của sự phát triển đô thị.
<br>
<br>— Ở nước ta, các thành phố, thị xã thường có khu hành chính (phân
<br>“đô”) và khu buôn bán, dịch vụ (phân “thị'). Ở Hà Nội, Thành phố
<br>Hồ Chí Minh các trung tâm giao dịch, thương mại của thành phố đang
<br>được hình thành rõ nét.
<br>
<br>CÂU HỎI VÀ BÀI TẬP
<br>
<br>174
<br>
<br>1. Cho biết đặc điểm dân số của nước ta (đông, tăng còn tương đối
<br>nhanh, mức sống đang nâng lên và đô thị hoá đang phát triển với
<br>tốc độ nhanh hơn) có ảnh hưởng đến các ngành dịch vụ như thế
<br>nào ? Các đặc điểm đó đòi hỏi những ngành dịch vụ nào cần được
<br>ưu tiê...</code> |
* Loss: [<code>MatryoshkaLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters:
```json
{
"loss": "MultipleNegativesRankingLoss",
"matryoshka_dims": [
768,
512,
256,
128,
64
],
"matryoshka_weights": [
1,
1,
1,
1,
1
],
"n_dims_per_step": -1
}
```
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: steps
- `per_device_train_batch_size`: 6
- `per_device_eval_batch_size`: 6
- `learning_rate`: 3e-06
- `num_train_epochs`: 2
- `warmup_ratio`: 0.05
- `bf16`: True
- `batch_sampler`: no_duplicates
#### All Hyperparameters
<details><summary>Click to expand</summary>
- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 6
- `per_device_eval_batch_size`: 6
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 3e-06
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 2
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.05
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: True
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: None
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `include_for_metrics`: []
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `average_tokens_across_devices`: False
- `prompts`: None
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional
</details>
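Taken together, the hyperparameters above correspond roughly to the following training sketch with the `SentenceTransformerTrainer` API. The `model` and `loss` objects are as in the loss sketch earlier; `train_dataset` and `eval_dataset` are placeholders for the CSV-derived anchor/positive splits described above:
```python
from sentence_transformers import (
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="gte-base-vietnamese-finetune-matryoshka",
    num_train_epochs=2,
    per_device_train_batch_size=6,
    per_device_eval_batch_size=6,
    learning_rate=3e-6,
    warmup_ratio=0.05,
    bf16=True,
    eval_strategy="steps",
    batch_sampler=BatchSamplers.NO_DUPLICATES,  # no duplicate anchors per batch
)

trainer = SentenceTransformerTrainer(
    model=model,                  # base model wrapped as above
    args=args,
    train_dataset=train_dataset,  # placeholder: anchor/positive pairs
    eval_dataset=eval_dataset,    # placeholder
    loss=loss,                    # MatryoshkaLoss from the sketch above
)
trainer.train()
```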
### Training Logs
| Epoch | Step | Training Loss | Validation Loss | gte_multilingual_base_768_cosine_ndcg@10 | gte_multilingual_base_512_cosine_ndcg@10 | gte_multilingual_base_256_cosine_ndcg@10 | gte_multilingual_base_128_cosine_ndcg@10 | gte_multilingual_base_64_cosine_ndcg@10 |
|:------:|:----:|:-------------:|:---------------:|:----------------------------------------:|:----------------------------------------:|:----------------------------------------:|:----------------------------------------:|:---------------------------------------:|
| 0.0305 | 100 | 1.1057 | 0.7163 | 0.5609 | 0.5532 | 0.5375 | 0.4939 | 0.4168 |
| 0.0609 | 200 | 0.7976 | 0.5554 | 0.5724 | 0.5696 | 0.5491 | 0.5068 | 0.4351 |
| 0.0914 | 300 | 0.6724 | 0.4082 | 0.5819 | 0.5778 | 0.5592 | 0.5177 | 0.4453 |
| 0.1218 | 400 | 0.4439 | 0.3058 | 0.5868 | 0.5832 | 0.5643 | 0.5231 | 0.4558 |
| 0.1523 | 500 | 0.3544 | 0.2573 | 0.5873 | 0.5836 | 0.5631 | 0.5264 | 0.4597 |
| 0.1827 | 600 | 0.3483 | 0.2358 | 0.5897 | 0.5856 | 0.5690 | 0.5309 | 0.4679 |
| 0.2132 | 700 | 0.4737 | 0.2248 | 0.5917 | 0.5883 | 0.5767 | 0.5350 | 0.4747 |
| 0.2436 | 800 | 0.3216 | 0.2193 | 0.5899 | 0.5853 | 0.5712 | 0.5330 | 0.4734 |
| 0.2741 | 900 | 0.3239 | 0.2109 | 0.5918 | 0.5883 | 0.5719 | 0.5344 | 0.4712 |
| 0.3045 | 1000 | 0.3111 | 0.2065 | 0.5882 | 0.5856 | 0.5708 | 0.5331 | 0.4751 |
| 0.3350 | 1100 | 0.3516 | 0.2024 | 0.5889 | 0.5854 | 0.5714 | 0.5352 | 0.4760 |
| 0.3654 | 1200 | 0.3344 | 0.2033 | 0.5860 | 0.5832 | 0.5687 | 0.5339 | 0.4764 |
| 0.3959 | 1300 | 0.3161 | 0.1907 | 0.5920 | 0.5898 | 0.5718 | 0.5369 | 0.4756 |
| 0.4263 | 1400 | 0.3094 | 0.1905 | 0.5948 | 0.5915 | 0.5723 | 0.5374 | 0.4774 |
| 0.4568 | 1500 | 0.2981 | 0.1859 | 0.5924 | 0.5919 | 0.5736 | 0.5370 | 0.4755 |
| 0.4872 | 1600 | 0.3332 | 0.1860 | 0.5877 | 0.5881 | 0.5697 | 0.5361 | 0.4760 |
| 0.5177 | 1700 | 0.259 | 0.1877 | 0.5811 | 0.5820 | 0.5683 | 0.5343 | 0.4779 |
| 0.5481 | 1800 | 0.282 | 0.1924 | 0.5788 | 0.5811 | 0.5664 | 0.5337 | 0.4804 |
| 0.5786 | 1900 | 0.2739 | 0.1943 | 0.5803 | 0.5803 | 0.5685 | 0.5383 | 0.4823 |
| 0.6090 | 2000 | 0.2049 | 0.1893 | 0.5856 | 0.5826 | 0.5680 | 0.5380 | 0.4794 |
| 0.6395 | 2100 | 0.3545 | 0.1780 | 0.5920 | 0.5885 | 0.5717 | 0.5393 | 0.4743 |
| 0.6699 | 2200 | 0.3008 | 0.1769 | 0.5919 | 0.5879 | 0.5732 | 0.5392 | 0.4755 |
| 0.7004 | 2300 | 0.3561 | 0.1764 | 0.5909 | 0.5883 | 0.5735 | 0.5392 | 0.4777 |
| 0.7308 | 2400 | 0.4883 | 0.1705 | 0.5977 | 0.5922 | 0.5777 | 0.5451 | 0.4797 |
| 0.7613 | 2500 | 0.235 | 0.1665 | 0.5966 | 0.5928 | 0.5799 | 0.5434 | 0.4805 |
| 0.7917 | 2600 | 0.3415 | 0.1636 | 0.5960 | 0.5910 | 0.5780 | 0.5444 | 0.4815 |
| 0.8222 | 2700 | 0.2424 | 0.1637 | 0.5936 | 0.5917 | 0.5758 | 0.5455 | 0.4821 |
| 0.8526 | 2800 | 0.1937 | 0.1635 | 0.5961 | 0.5896 | 0.5790 | 0.5446 | 0.4841 |
| 0.8831 | 2900 | 0.1986 | 0.1620 | 0.5922 | 0.5884 | 0.5770 | 0.5428 | 0.4834 |
| 0.9135 | 3000 | 0.2009 | 0.1587 | 0.5963 | 0.5921 | 0.5793 | 0.5438 | 0.4820 |
| 0.9440 | 3100 | 0.221 | 0.1568 | 0.5964 | 0.5945 | 0.5810 | 0.5465 | 0.4824 |
| 0.9744 | 3200 | 0.1847 | 0.1592 | 0.5933 | 0.5913 | 0.5766 | 0.5440 | 0.4808 |
| 1.0049 | 3300 | 0.224 | 0.1629 | 0.5906 | 0.5882 | 0.5746 | 0.5410 | 0.4816 |
| 1.0353 | 3400 | 0.3356 | 0.1624 | 0.5884 | 0.5870 | 0.5728 | 0.5412 | 0.4795 |
| 1.0658 | 3500 | 0.2286 | 0.1624 | 0.5891 | 0.5864 | 0.5750 | 0.5419 | 0.4799 |
| 1.0962 | 3600 | 0.2176 | 0.1591 | 0.5933 | 0.5896 | 0.5772 | 0.5429 | 0.4824 |
| 1.1267 | 3700 | 0.1376 | 0.1592 | 0.5923 | 0.5884 | 0.5733 | 0.5415 | 0.4814 |
| 1.1571 | 3800 | 0.1222 | 0.1593 | 0.5918 | 0.5895 | 0.5737 | 0.5423 | 0.4828 |
| 1.1876 | 3900 | 0.2303 | 0.1600 | 0.5919 | 0.5847 | 0.5722 | 0.5423 | 0.4827 |
| 1.2180 | 4000 | 0.1984 | 0.1590 | 0.5920 | 0.5867 | 0.5742 | 0.5437 | 0.4858 |
| 1.2485 | 4100 | 0.1488 | 0.1596 | 0.5910 | 0.5850 | 0.5734 | 0.5402 | 0.4867 |
| 1.2789 | 4200 | 0.188 | 0.1597 | 0.5903 | 0.5843 | 0.5727 | 0.5401 | 0.4839 |
| 1.3094 | 4300 | 0.1507 | 0.1572 | 0.5884 | 0.5836 | 0.5717 | 0.5401 | 0.4848 |
| 1.3398 | 4400 | 0.2171 | 0.1585 | 0.5874 | 0.5833 | 0.5707 | 0.5408 | 0.4832 |
| 1.3703 | 4500 | 0.1938 | 0.1584 | 0.5885 | 0.5836 | 0.5706 | 0.5400 | 0.4836 |
| 1.4007 | 4600 | 0.1793 | 0.1566 | 0.5875 | 0.5834 | 0.5720 | 0.5409 | 0.4813 |
| 1.4312 | 4700 | 0.2104 | 0.1557 | 0.5898 | 0.5844 | 0.5727 | 0.5423 | 0.4815 |
| 1.4616 | 4800 | 0.1473 | 0.1562 | 0.5889 | 0.5854 | 0.5705 | 0.5413 | 0.4830 |
| 1.4921 | 4900 | 0.2356 | 0.1559 | 0.5878 | 0.5836 | 0.5708 | 0.5415 | 0.4834 |
| 1.5225 | 5000 | 0.1418 | 0.1565 | 0.5861 | 0.5835 | 0.5688 | 0.5413 | 0.4819 |
| 1.5530 | 5100 | 0.176 | 0.1572 | 0.5865 | 0.5820 | 0.5686 | 0.5407 | 0.4824 |
| 1.5834 | 5200 | 0.1911 | 0.1574 | 0.5859 | 0.5825 | 0.5688 | 0.5420 | 0.4824 |
| 1.6139 | 5300 | 0.1382 | 0.1562 | 0.5870 | 0.5826 | 0.5697 | 0.5423 | 0.4841 |
| 1.6443 | 5400 | 0.1825 | 0.1528 | 0.5880 | 0.5851 | 0.5714 | 0.5433 | 0.4830 |
| 1.6748 | 5500 | 0.2709 | 0.1524 | 0.5897 | 0.5858 | 0.5716 | 0.5430 | 0.4831 |
| 1.7052 | 5600 | 0.1992 | 0.1523 | 0.5900 | 0.5859 | 0.5727 | 0.5435 | 0.4827 |
| 1.7357 | 5700 | 0.326 | 0.1506 | 0.5910 | 0.5873 | 0.5736 | 0.5456 | 0.4842 |
| 1.7661 | 5800 | 0.1698 | 0.1495 | 0.5907 | 0.5865 | 0.5739 | 0.5443 | 0.4842 |
| 1.7966 | 5900 | 0.2013 | 0.1489 | 0.5916 | 0.5889 | 0.5738 | 0.5457 | 0.4826 |
| 1.8270 | 6000 | 0.1371 | 0.1484 | 0.5912 | 0.5883 | 0.5739 | 0.5454 | 0.4840 |
| 1.8575 | 6100 | 0.1351 | 0.1483 | 0.5917 | 0.5886 | 0.5735 | 0.5456 | 0.4844 |
| 1.8879 | 6200 | 0.1678 | 0.1486 | 0.5925 | 0.5878 | 0.5733 | 0.5450 | 0.4840 |
| 1.9184 | 6300 | 0.1154 | 0.1483 | 0.5915 | 0.5874 | 0.5742 | 0.5461 | 0.4847 |
| 1.9488 | 6400 | 0.1576 | 0.1482 | 0.5913 | 0.5880 | 0.5743 | 0.5469 | 0.4833 |
| 1.9793 | 6500 | 0.1609 | 0.1478 | 0.5921 | 0.5880 | 0.5736 | 0.5466 | 0.4843 |
### Framework Versions
- Python: 3.11.11
- Sentence Transformers: 3.3.1
- Transformers: 4.49.0
- PyTorch: 2.5.1
- Accelerate: 1.2.1
- Datasets: 3.2.0
- Tokenizers: 0.21.0
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
```
#### MatryoshkaLoss
```bibtex
@misc{kusupati2024matryoshka,
title={Matryoshka Representation Learning},
author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
year={2024},
eprint={2205.13147},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
```
#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
--> | [
"TEXT_CLASSIFICATION"
]
| [
"CHIA"
]
| Non_BioNLP |
# SentenceTransformer based on Alibaba-NLP/gte-multilingual-base
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [Alibaba-NLP/gte-multilingual-base](https://huggingface.co/Alibaba-NLP/gte-multilingual-base) on the csv dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [Alibaba-NLP/gte-multilingual-base](https://huggingface.co/Alibaba-NLP/gte-multilingual-base) <!-- at revision ca1791e0bcc104f6db161f27de1340241b13c5a4 -->
- **Maximum Sequence Length:** 8192 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Dataset:**
- csv
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 8192, 'do_lower_case': False}) with Transformer model: NewModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
)
```
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("minhtuan7akp/gte-base-vietnamese-finetune-matryoshka")
# Run inference
sentences = [
'Gia Cát Lượng đã giúp ai trong việc quản lý nước Thục?',
'phải trông coi mọi việc, giúp Thành Vương đến lúc trưởng thành. \r\n4\r\n Hoắc Quang giữ chức Đại tư mã tướng quân, phò Hán Chiêu Đế lúc lên ngôi mới 9 tuổi. \r\n5\r\n Gia Cát Lượng tức Khổng Minh, là thừa tướng của Chiêu Đế Lưu Bị nước Thục đời Tam Quốc. Lưu Bị chết, con là Lưu Thiện nối \r\nngôi, tức Thục Hậu chúa, mọi việc nước, việc quân đều phải trông cậy vào Gia Cát Lượng. \r\n6\r\n Tô Hiến Thành là Thái úy triều Lý Cao Tông, nhận di mệnh Cao Tông phò vua nhỏ là Long Cán lên nối ngôi mới 3 tuổi. \r\n7\r\n Tứ phụ: nghĩa là bốn viên đại thần giúp vua khi mới lên ngôi. \r\n8\r\n Chỉ Thuận Tông. \r\n9\r\n Xích chủy: nghĩa là mõm đỏ, miệng đỏ, hay đỏ mỏ. Xích chủy hầu là loài đỏ mỏ ám chỉ Lê Quý Ly. \r\n10 Bạch kê: nghĩa là gà trắng. Nghệ Tông sinh năm Tân Dậu, tức năm gà. Tân thuộc hành kim, loài kim sắc trắng. Vì thế "bạch kê" \r\nám chỉ Nghệ Tông. \r\n11 Chữ vương? ở trong lòng chữ khẩu? là chữ "quốc"?. \r\n12 Theo tục nhà Trần, hằng năm vào ngày mồng 4 tháng 4, vua hội họp bề tôi làm lễ tuyên thệ ở đền Đồng Cổ. (Xem bản kỷ, quyển \r\n5, Kiến Trung năm thứ 3, 1277). \r\n13 Chỉ Quý Ly. \r\n288 Đại Việt Sử Ký Toàn Thư - Bản Kỷ - Quyển VIII \r\nQuý Ly bỏ mũ, rập đầu khóc lóc từ tạ, chỉ trời vạch đất thề rằng: \r\n"Nếu thần không biết dốc lòng trung, hết sức giúp Quan gia để truyền đến con cháu về sau thì \r\ntrời sẽ ghét bỏ thần". \r\nQuý Ly lại nói: "Lúc Linh Đức Vương làm điều thất đức, nếu không nhờ oai linh bệ hạ thì thần đã',
'Tây. Ngoài cơ sờ đúc súng cũ của tiên triều, năm 1825 vua Minh \r\nMệnh mờ thêm sáu xưởng nữa. vốn cần cù và ham học hỏi sáng \r\ntạo, những người thợ quân giới đã được "thứ súng tay nạp thuốc nổ \r\nmạnh theo kiểu Tây dương". Vào những năm cuối triều Minh \r\nM ệnh, họ đã đúc 15 cỗ đại pháo X ung tiêu băng đồng và hai cỗ \r\nsúng lớn Chấn hải, loại đại pháo lợi hại trong thủy chiến phương \r\nTây. Sau đó, lại xuất xưởng tiếp 30 cỗ Chấn hải. Năm 1829, quản \r\nkho Hải Dương là Tôn Thất Thiện cùng với 100 lính Chấn cơ chế \r\nra cối gỗ chạy bàng sức nước ở khe suối để giã, luyện thuốc súng. \r\nDụng cụ này là xe "Thủy hỏa ký tế", và những năm sau được phổ \r\ncập trong quân ngũ. Từ vũ khí phương Tây, người Đại Nam đã tự \r\ntìm hiểu từng chi tiết để chế tạo thước đo ngắm bắn, thước kiểm tra \r\nthuốc súng. Trong bảy năm ờ ngôi, vua Thiệu Trị đúc 9 cỗ súng \r\nbàng đồng hiệu là "Thần uy phục viễn đại tướng quân", cỗ to nhất \r\nlà 10.706 cân, cỗ nhỏ nhất là 10.222 cân, tổng cộng là 93.829 cân.\r\n649\r\nLỊCH SỬ VIỆT NAM - TẬP 5\r\nVà ba cỗ súng hiệu "Bảo Đại định công an dân hòa chúng thượng \r\ntướng quân", mỗi cỗ trên 14.500 cân, tổng cộng là 43.620 cân1.\r\nĐe tạo điều kiện cho quân thủy học tập, bộ Công cấp cho họ la \r\nbàn, thước đo nước, đồng hồ cát xem giờ của phương Tây. v ề khoa \r\nmục bắn súng thì lính thủy phải tập bắn súng điểu sang và đại bác. \r\nMinh Mệnh yêu cầu Hiệp biện Đại học sĩ lãnh Thượng thư bộ Binh \r\nTrương Đăng Quế đọc kỹ các sách và bản đồ thủy chiến "Tây',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
<!--
### Direct Usage (Transformers)
<details><summary>Click to see the direct usage in Transformers</summary>
</details>
-->
<!--
### Downstream Usage (Sentence Transformers)
You can finetune this model on your own dataset.
<details><summary>Click to expand</summary>
</details>
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
## Evaluation
### Metrics
#### Information Retrieval
* Datasets: `gte_multilingual_base_768`, `gte_multilingual_base_512`, `gte_multilingual_base_256`, `gte_multilingual_base_128` and `gte_multilingual_base_64`
* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
| Metric | gte_multilingual_base_768 | gte_multilingual_base_512 | gte_multilingual_base_256 | gte_multilingual_base_128 | gte_multilingual_base_64 |
|:--------------------|:--------------------------|:--------------------------|:--------------------------|:--------------------------|:-------------------------|
| cosine_accuracy@1 | 0.3973 | 0.3877 | 0.3772 | 0.3612 | 0.3082 |
| cosine_accuracy@3 | 0.6333 | 0.6311 | 0.6146 | 0.5826 | 0.5146 |
| cosine_accuracy@5 | 0.7132 | 0.7096 | 0.6872 | 0.6502 | 0.5863 |
| cosine_accuracy@10 | 0.7817 | 0.7822 | 0.7662 | 0.7342 | 0.6621 |
| cosine_precision@1 | 0.3973 | 0.3877 | 0.3772 | 0.3612 | 0.3082 |
| cosine_precision@3 | 0.2111 | 0.2104 | 0.2049 | 0.1942 | 0.1715 |
| cosine_precision@5 | 0.1426 | 0.1419 | 0.1374 | 0.13 | 0.1173 |
| cosine_precision@10 | 0.0782 | 0.0782 | 0.0766 | 0.0734 | 0.0662 |
| cosine_recall@1 | 0.3973 | 0.3877 | 0.3772 | 0.3612 | 0.3082 |
| cosine_recall@3 | 0.6333 | 0.6311 | 0.6146 | 0.5826 | 0.5146 |
| cosine_recall@5 | 0.7132 | 0.7096 | 0.6872 | 0.6502 | 0.5863 |
| cosine_recall@10 | 0.7817 | 0.7822 | 0.7662 | 0.7342 | 0.6621 |
| **cosine_ndcg@10** | **0.5921** | **0.588** | **0.5736** | **0.5466** | **0.4843** |
| cosine_mrr@10 | 0.531 | 0.5253 | 0.5117 | 0.4866 | 0.4275 |
| cosine_map@100 | 0.538 | 0.5319 | 0.5189 | 0.4943 | 0.4371 |
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Dataset
#### csv
* Dataset: csv
* Size: 21,892 training samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 1000 samples:
| | anchor | positive |
|:--------|:-----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 9 tokens</li><li>mean: 26.95 tokens</li><li>max: 103 tokens</li></ul> | <ul><li>min: 25 tokens</li><li>mean: 373.94 tokens</li><li>max: 596 tokens</li></ul> |
* Samples:
| anchor | positive |
|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>Tính chất kiến trúc của đình làng triều Mạc được thể hiện qua những đặc điểm gì, như số gian, hình dạng, nội thất và cách bố trí không gian trong công trình?</code> | <code>Đình làng là công trình kiến trúc công cộng được dựng nên
<br>băng sự đóng góp của cải và công sức của cả cộng đồng làng xã.
<br>Ngoài chức năng là trụ sở hành chính của cả làng, ngôi đình còn là
<br>trung tâm sinh hoạt văn hóa làng xã, là nơi diễn ra các nghi lễ trọng
<br>đại trong dịp tế lễ thần Thành hoàng làng và tô chức hội hè hăng
<br>năm. Có thê nói, ngôi đình làng là nơi hội tụ sức mạnh của cả cộng
<br>đồng và là biểu trưng đặc sắc nhất của văn hóa làng xã.
<br>
<br>Trong các ngôi đình triều Mạc, Thân thành hoàng có lý lịch
<br>xuất thân khá phong phú. Tản Viên sơn thánh là vị thần có ảnh
<br>hưởng lớn ở xứ Đoài được thờ phụng ở đình Tây Đăng, Thanh Lũng
<br>và nhiều làng xã khác. Thần Cao Sơn, Quý Minh tương truyền là
<br>tướng tâm phúc của Hùng Vương được thờ ở đình làng Lỗ Hạnh.
<br>Dân làng Lỗ Hạnh còn thờ cả Phương Dung công chúa... Từ thế
<br>kỷ XYVI và các thế kỷ tiếp sau, Thần thành hoàng làng trở thành
<br>vị vua tỉnh thần ở các làng xã, tín ngưỡng thờ cúng Thân thành
<br>hoàng càng trở nên phong phú thê hiện qua lễ...</code> |
| <code>Nguyễn Khắc Nhu có vai trò gì trong khởi nghĩa toàn khu vực miền núi Bắc Kỳ của Việt Nam Quốc dân Đảng vào năm 1930?</code> | <code>bị nổ do bất cẩn. Do đó công việc bị phát hiện. Hai người phụ trách
<br>cơ quan chế bom là Đỗ Cương và Quản Trác trốn thoát. Nhiều binh
<br>lính và dân thường bị bắt. Công việc bạo động của Xứ Nhu không
<br>thành. Đúng lúc này Việt Nam Quốc dân Đảng vừa thành lập, cử
<br>người tới mời Xứ Nhu và Việt Nam Dân quốc gia nhập Việt Nam
<br>Quốc dân Đảng. Hầu hết các đồng chí của Xứ Nhu trở thành đảng
<br>viên của Việt Nam Quốc dân Đảng ở vùng Bắc Ninh, Bắc Giang.
<br>Do đó, Việt Nam Quốc dân Đảng mạnh lên về số lượng1. Cùng với
<br>việc phát triển đảng viên ở Bẳc Ninh, Bắc Giang, Việt Nam Quốc
<br>dân Đảng còn thiết lập nhiều cơ sở ở các tỉnh Thái Bình, Hải Dương,
<br>1. Nguyễn Khắc Nhu tức Xứ Nhu (1882-1930), người làng Song Khê, huyện
<br>Yên Dũng, tinh Bắc Giang. Với lòng yêu nuớc và ý chí chống Pháp,
<br>ông dự tính thành lập một tổ chức hoạt động công khai nhăm đào tạo
<br>tài năng cho đất nước lấy tên là "Hội Quốc dân dục tài”. Việc này
<br>không thành công, ông lại lập tổ chức bí mật nhăm bạo động lật đổ ách
<br>áp b...</code> |
| <code>Giá gạo tháng 3-1950 ở Liên khu IV là bao nhiêu đồng/tạ và có chênh lệch gì so với giá gạo ở Liên khu III và Liên khu Việt Bắc?</code> | <code>ngày càng tăng nhanh, nhất là ở Việt Bắc. Giá gạo tăng mạnh
<br>nhất, giá thực phẩm cũng tăng dần theo giá gạo. Giá các mặt hàng
<br>kỹ nghệ tăng chậm hơn. Giá hàng ngoại hóa hầu như không tăng
<br>vỉ trong vùng Pháp chiếm đóng, hàng ngoại hóa tính bằng tiền
<br>Đông Dương không tăng, hom nữa nhân dân cũng ít tiêu thụ hàng
<br>ngoại hóa vì bị cấm.
<br>1. Viện Kinh tế học, Kinh tế Việt Nam từ Cách mạng Tháng Tám đến..., Sách
<br>đã dẫn, tr. 238.
<br>2. Chuơng trình và báo cáo của Bộ Kinh tế về tình hình hoạt động năm 1950.
<br>Trung tâm lưu trữ quốc gia in, phông Phủ Thủ tướng, Hồ sơ số 1914.
<br>488
<br>Chương VI. Việt Nam dân chủ cộng hòa xây dựng..
<br>Giá gạo trong những tháng đầu năm 1950 so với cuối năm 1949
<br>có thay đổi, Liên khu IV (Thanh Hóa) giá tăng lên 154%; Liên khu
<br>III (Hà Đông - Hà Nam) giá tăng lên 153%; Liên khu Việt Bắc
<br>(Thái Nguyên) giá tăng lên 800%.
<br>Giá gạo ở Thái Nguyên từ 1.625 đồng/tạ lên 13.000 đồng/tạ
<br>(tăng 800%); ờ Phú Thọ từ 2.650 đồng/tạ lên 7.500 đồng/tạ (tăng
<br>283%). Mặt khác, ...</code> |
* Loss: [<code>MatryoshkaLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters:
```json
{
"loss": "MultipleNegativesRankingLoss",
"matryoshka_dims": [
768,
512,
256,
128,
64
],
"matryoshka_weights": [
1,
1,
1,
1,
1
],
"n_dims_per_step": -1
}
```
### Evaluation Dataset
#### csv
* Dataset: csv
* Size: 21,892 evaluation samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 1000 samples:
| | anchor | positive |
|:--------|:------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 10 tokens</li><li>mean: 26.56 tokens</li><li>max: 108 tokens</li></ul> | <ul><li>min: 24 tokens</li><li>mean: 369.01 tokens</li><li>max: 559 tokens</li></ul> |
* Samples:
| anchor | positive |
|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>Nguyễn Hoàng đã thực hiện những hành động gì để dần dần tách khỏi sự ràng buộc của họ Trịnh sau khi trở lại Thuận Quảng vào năm 1600, và những hành động này đã ảnh hưởng như thế nào đến mối quan hệ giữa hai dòng họ?</code> | <code>thẳng đối với họ Nguyễn. Trịnh Tùng đã lấy danh nghĩa vua Lê sai
<br>sứ giả là Thiêm đô ngự sử Lê Nghĩa Trạch đem sắc vào phủ dụ
<br>Nguyễn Hoàng và vẫn cho ở lại trấn thủ, hằng năm nộp thuế như
<br>cũ. Cùng với sắc của vua Lê, Trịnh Tùng có gửi thư kèm theo
<br>Chương ĩ. Sự phân liệt Đàng Trong - Đàng Ngoài...
<br>1, Toàn thư. quyển 17, tập IV, Sđd, tr. 200.
<br>2, Đại Nam thực lục, Tiền biên, quyển 1, tập I, Sđd, tr. 34.
<br>3, Đại Nam thực lục, Tiển biên, quyển 1, tập I, Sđd, tr. 35.
<br>39
<br>LỊCH SỬ VIỆT NAM - TẬP 4
<br>"khuyên giữ việc thuế cống". Nguyễn Hoàng sai sứ giả đáp lễ tạ on
<br>vua Lê và gửi thư cho Trịnh Tùng hẹn kết nghĩa thông gia, đem con
<br>gái là Ngọc Tú gả cho Trịnh Tráng (con Trịnh Tùng) lấy danh
<br>nghĩa hôn nhân để duy trì mối quan hệ bề ngoài giao hảo giữa hai
<br>dòng họ vốn có sẵn một mối thù địch.
<br>- Chính sách cùa họ Nguyễn từ khi Nguyễn Hoàng trở lại
<br>Thuận Quảng
<br>Năm 1600, Nguyễn Hoàng ròi được khỏi đất Bẳc trở về Thuận
<br>Quảng bắt đầu thực hiện một chính sách cai trị mói, dần dần tác...</code> |
| <code>Báo cáo của Ủy ban Kháng chiến hành chính Hà Nội về hoạt động giáo dục bù nhìn và tình hình các giáo sư trường Chu Văn An có nội dung gì?</code> | <code>Tài liệu tham khảo
<br>21. Báo cáo sô' 2 BC/I ngày 12-11-1949 và Báo cáo sô' 463
<br>BC/DB ngày 25-12-1949 của Ty Công an H à Nội. Trung
<br>tâm Lưu trữ Quốc gia III, phông Phủ Thủ tướng, Hồ sơ
<br>SỐ921.
<br>28. Báo “Le song” ngày 11-2-1949. Trung tâm Lưu trữ Quốc
<br>gia III, phông Phủ Thủ tướng, Hồ sơ sô' 2002.
<br>29. Báo cáo của u ỷ ban Kháng chiến hành chính Hà Nội vê
<br>hoạt động giáo dục bù nhìn và tình hình các giáo sư
<br>trường Chu Văn An. Trung tâm Lưu trữ Quốc gia III,
<br>phông Phủ Thủ tướng, Hồ sơ số 979.
<br>30. Báo cáo của Tổng Giám đốc Việt N am Công an vụ sô'
<br>122/NCB3 ngày 1-4-1951. Trung tâm Lưu trữ Quốic gia
<br>III, phông Phủ Thủ tướng, Hồ sơ sô' 979.
<br>31. Báo cáo thành tích về cống tác công an trong 8 năm kháng
<br>chiến (1946-1954) của Bộ Công an. Trung tâm Lưu trữ
<br>Quốc gia III, phông Phủ Thủ tướng, Hồ sơ sô' 927.
<br>32. Báo cáo một năm kháng chiến (12-1946 đến 12-1947) của
<br>UBKCHC Khu 12. Trung tâm Lưu trữ Quốc gia III, phông
<br>Phủ Thủ tướng, Hồ sơ sô" 2000.
<br>33. Báo cáo thành tích quăn sự trong 8 n...</code> |
| <code>Đặc điểm dân số của nước ta ảnh hưởng đến các ngành dịch vụ như thế nào và đòi hỏi những ngành dịch vụ nào cần được ưu tiên phát triển trong quá trình đô thị hóa?</code> | <code>— Trong các thành phố lớn thường hình thành các trung tâm giao dịch,
<br>thương mại. Đó là nơi tập trung các ngân hàng, các văn phòng đại diện
<br>của các công ti, các siêu thị hay các tổ hợp thương mại, dịch vụ lớn...
<br>Ở các thành phố lớn trên thế giới, thường dễ nhận thấy các trung tâm
<br>thương mại này do sự tập trung các ngôi nhà cao tầng, chọc trời. Một
<br>thành phố có thể có trung tâm thương mại chính và một số trung tâm
<br>thương mại nhỏ hơn, kết quả của sự phát triển đô thị.
<br>
<br>— Ở nước ta, các thành phố, thị xã thường có khu hành chính (phân
<br>“đô”) và khu buôn bán, dịch vụ (phân “thị'). Ở Hà Nội, Thành phố
<br>Hồ Chí Minh các trung tâm giao dịch, thương mại của thành phố đang
<br>được hình thành rõ nét.
<br>
<br>CÂU HỎI VÀ BÀI TẬP
<br>
<br>174
<br>
<br>1. Cho biết đặc điểm dân số của nước ta (đông, tăng còn tương đối
<br>nhanh, mức sống đang nâng lên và đô thị hoá đang phát triển với
<br>tốc độ nhanh hơn) có ảnh hưởng đến các ngành dịch vụ như thế
<br>nào ? Các đặc điểm đó đòi hỏi những ngành dịch vụ nào cần được
<br>ưu tiê...</code> |
* Loss: [<code>MatryoshkaLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters:
```json
{
"loss": "MultipleNegativesRankingLoss",
"matryoshka_dims": [
768,
512,
256,
128,
64
],
"matryoshka_weights": [
1,
1,
1,
1,
1
],
"n_dims_per_step": -1
}
```
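For orientation, here is a minimal sketch of how this loss combination is typically assembled in the Sentence Transformers API. The base model id is taken from this card's metadata; the snippet is illustrative, not the exact training script.

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss

# Base model from this card's metadata; gte-multilingual-base requires
# trust_remote_code because it ships custom modeling code.
model = SentenceTransformer("Alibaba-NLP/gte-multilingual-base", trust_remote_code=True)

# MultipleNegativesRankingLoss treats the other positives in a batch as
# negatives for each anchor, which is why the `no_duplicates` batch sampler
# listed under the hyperparameters below matters.
inner_loss = MultipleNegativesRankingLoss(model)

# MatryoshkaLoss re-applies the inner loss on embeddings truncated to each
# listed dimensionality, with equal weights, mirroring the JSON above.
loss = MatryoshkaLoss(
    model,
    inner_loss,
    matryoshka_dims=[768, 512, 256, 128, 64],
    matryoshka_weights=[1, 1, 1, 1, 1],
    n_dims_per_step=-1,  # -1 = use every dimension at every training step
)
```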
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: steps
- `per_device_train_batch_size`: 6
- `per_device_eval_batch_size`: 6
- `learning_rate`: 3e-06
- `num_train_epochs`: 2
- `warmup_ratio`: 0.05
- `bf16`: True
- `batch_sampler`: no_duplicates
#### All Hyperparameters
<details><summary>Click to expand</summary>
- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 6
- `per_device_eval_batch_size`: 6
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 3e-06
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 2
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.05
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: True
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: None
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `include_for_metrics`: []
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `average_tokens_across_devices`: False
- `prompts`: None
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional
</details>
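As a sketch, the non-default values above translate into `SentenceTransformerTrainingArguments` roughly as follows; `output_dir` is a placeholder and was not part of the reported configuration.

```python
from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

# Hypothetical reconstruction of the non-default hyperparameters above.
args = SentenceTransformerTrainingArguments(
    output_dir="output/matryoshka-run",  # placeholder, not from the original run
    eval_strategy="steps",
    per_device_train_batch_size=6,
    per_device_eval_batch_size=6,
    learning_rate=3e-6,
    num_train_epochs=2,
    warmup_ratio=0.05,
    bf16=True,
    batch_sampler=BatchSamplers.NO_DUPLICATES,  # no duplicate samples per batch
)
```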
### Training Logs
| Epoch | Step | Training Loss | Validation Loss | gte_multilingual_base_768_cosine_ndcg@10 | gte_multilingual_base_512_cosine_ndcg@10 | gte_multilingual_base_256_cosine_ndcg@10 | gte_multilingual_base_128_cosine_ndcg@10 | gte_multilingual_base_64_cosine_ndcg@10 |
|:------:|:----:|:-------------:|:---------------:|:----------------------------------------:|:----------------------------------------:|:----------------------------------------:|:----------------------------------------:|:---------------------------------------:|
| 0.0305 | 100 | 1.1057 | 0.7163 | 0.5609 | 0.5532 | 0.5375 | 0.4939 | 0.4168 |
| 0.0609 | 200 | 0.7976 | 0.5554 | 0.5724 | 0.5696 | 0.5491 | 0.5068 | 0.4351 |
| 0.0914 | 300 | 0.6724 | 0.4082 | 0.5819 | 0.5778 | 0.5592 | 0.5177 | 0.4453 |
| 0.1218 | 400 | 0.4439 | 0.3058 | 0.5868 | 0.5832 | 0.5643 | 0.5231 | 0.4558 |
| 0.1523 | 500 | 0.3544 | 0.2573 | 0.5873 | 0.5836 | 0.5631 | 0.5264 | 0.4597 |
| 0.1827 | 600 | 0.3483 | 0.2358 | 0.5897 | 0.5856 | 0.5690 | 0.5309 | 0.4679 |
| 0.2132 | 700 | 0.4737 | 0.2248 | 0.5917 | 0.5883 | 0.5767 | 0.5350 | 0.4747 |
| 0.2436 | 800 | 0.3216 | 0.2193 | 0.5899 | 0.5853 | 0.5712 | 0.5330 | 0.4734 |
| 0.2741 | 900 | 0.3239 | 0.2109 | 0.5918 | 0.5883 | 0.5719 | 0.5344 | 0.4712 |
| 0.3045 | 1000 | 0.3111 | 0.2065 | 0.5882 | 0.5856 | 0.5708 | 0.5331 | 0.4751 |
| 0.3350 | 1100 | 0.3516 | 0.2024 | 0.5889 | 0.5854 | 0.5714 | 0.5352 | 0.4760 |
| 0.3654 | 1200 | 0.3344 | 0.2033 | 0.5860 | 0.5832 | 0.5687 | 0.5339 | 0.4764 |
| 0.3959 | 1300 | 0.3161 | 0.1907 | 0.5920 | 0.5898 | 0.5718 | 0.5369 | 0.4756 |
| 0.4263 | 1400 | 0.3094 | 0.1905 | 0.5948 | 0.5915 | 0.5723 | 0.5374 | 0.4774 |
| 0.4568 | 1500 | 0.2981 | 0.1859 | 0.5924 | 0.5919 | 0.5736 | 0.5370 | 0.4755 |
| 0.4872 | 1600 | 0.3332 | 0.1860 | 0.5877 | 0.5881 | 0.5697 | 0.5361 | 0.4760 |
| 0.5177 | 1700 | 0.259 | 0.1877 | 0.5811 | 0.5820 | 0.5683 | 0.5343 | 0.4779 |
| 0.5481 | 1800 | 0.282 | 0.1924 | 0.5788 | 0.5811 | 0.5664 | 0.5337 | 0.4804 |
| 0.5786 | 1900 | 0.2739 | 0.1943 | 0.5803 | 0.5803 | 0.5685 | 0.5383 | 0.4823 |
| 0.6090 | 2000 | 0.2049 | 0.1893 | 0.5856 | 0.5826 | 0.5680 | 0.5380 | 0.4794 |
| 0.6395 | 2100 | 0.3545 | 0.1780 | 0.5920 | 0.5885 | 0.5717 | 0.5393 | 0.4743 |
| 0.6699 | 2200 | 0.3008 | 0.1769 | 0.5919 | 0.5879 | 0.5732 | 0.5392 | 0.4755 |
| 0.7004 | 2300 | 0.3561 | 0.1764 | 0.5909 | 0.5883 | 0.5735 | 0.5392 | 0.4777 |
| 0.7308 | 2400 | 0.4883 | 0.1705 | 0.5977 | 0.5922 | 0.5777 | 0.5451 | 0.4797 |
| 0.7613 | 2500 | 0.235 | 0.1665 | 0.5966 | 0.5928 | 0.5799 | 0.5434 | 0.4805 |
| 0.7917 | 2600 | 0.3415 | 0.1636 | 0.5960 | 0.5910 | 0.5780 | 0.5444 | 0.4815 |
| 0.8222 | 2700 | 0.2424 | 0.1637 | 0.5936 | 0.5917 | 0.5758 | 0.5455 | 0.4821 |
| 0.8526 | 2800 | 0.1937 | 0.1635 | 0.5961 | 0.5896 | 0.5790 | 0.5446 | 0.4841 |
| 0.8831 | 2900 | 0.1986 | 0.1620 | 0.5922 | 0.5884 | 0.5770 | 0.5428 | 0.4834 |
| 0.9135 | 3000 | 0.2009 | 0.1587 | 0.5963 | 0.5921 | 0.5793 | 0.5438 | 0.4820 |
| 0.9440 | 3100 | 0.221 | 0.1568 | 0.5964 | 0.5945 | 0.5810 | 0.5465 | 0.4824 |
| 0.9744 | 3200 | 0.1847 | 0.1592 | 0.5933 | 0.5913 | 0.5766 | 0.5440 | 0.4808 |
| 1.0049 | 3300 | 0.224 | 0.1629 | 0.5906 | 0.5882 | 0.5746 | 0.5410 | 0.4816 |
| 1.0353 | 3400 | 0.3356 | 0.1624 | 0.5884 | 0.5870 | 0.5728 | 0.5412 | 0.4795 |
| 1.0658 | 3500 | 0.2286 | 0.1624 | 0.5891 | 0.5864 | 0.5750 | 0.5419 | 0.4799 |
| 1.0962 | 3600 | 0.2176 | 0.1591 | 0.5933 | 0.5896 | 0.5772 | 0.5429 | 0.4824 |
| 1.1267 | 3700 | 0.1376 | 0.1592 | 0.5923 | 0.5884 | 0.5733 | 0.5415 | 0.4814 |
| 1.1571 | 3800 | 0.1222 | 0.1593 | 0.5918 | 0.5895 | 0.5737 | 0.5423 | 0.4828 |
| 1.1876 | 3900 | 0.2303 | 0.1600 | 0.5919 | 0.5847 | 0.5722 | 0.5423 | 0.4827 |
| 1.2180 | 4000 | 0.1984 | 0.1590 | 0.5920 | 0.5867 | 0.5742 | 0.5437 | 0.4858 |
| 1.2485 | 4100 | 0.1488 | 0.1596 | 0.5910 | 0.5850 | 0.5734 | 0.5402 | 0.4867 |
| 1.2789 | 4200 | 0.188 | 0.1597 | 0.5903 | 0.5843 | 0.5727 | 0.5401 | 0.4839 |
| 1.3094 | 4300 | 0.1507 | 0.1572 | 0.5884 | 0.5836 | 0.5717 | 0.5401 | 0.4848 |
| 1.3398 | 4400 | 0.2171 | 0.1585 | 0.5874 | 0.5833 | 0.5707 | 0.5408 | 0.4832 |
| 1.3703 | 4500 | 0.1938 | 0.1584 | 0.5885 | 0.5836 | 0.5706 | 0.5400 | 0.4836 |
| 1.4007 | 4600 | 0.1793 | 0.1566 | 0.5875 | 0.5834 | 0.5720 | 0.5409 | 0.4813 |
| 1.4312 | 4700 | 0.2104 | 0.1557 | 0.5898 | 0.5844 | 0.5727 | 0.5423 | 0.4815 |
| 1.4616 | 4800 | 0.1473 | 0.1562 | 0.5889 | 0.5854 | 0.5705 | 0.5413 | 0.4830 |
| 1.4921 | 4900 | 0.2356 | 0.1559 | 0.5878 | 0.5836 | 0.5708 | 0.5415 | 0.4834 |
| 1.5225 | 5000 | 0.1418 | 0.1565 | 0.5861 | 0.5835 | 0.5688 | 0.5413 | 0.4819 |
| 1.5530 | 5100 | 0.176 | 0.1572 | 0.5865 | 0.5820 | 0.5686 | 0.5407 | 0.4824 |
| 1.5834 | 5200 | 0.1911 | 0.1574 | 0.5859 | 0.5825 | 0.5688 | 0.5420 | 0.4824 |
| 1.6139 | 5300 | 0.1382 | 0.1562 | 0.5870 | 0.5826 | 0.5697 | 0.5423 | 0.4841 |
| 1.6443 | 5400 | 0.1825 | 0.1528 | 0.5880 | 0.5851 | 0.5714 | 0.5433 | 0.4830 |
| 1.6748 | 5500 | 0.2709 | 0.1524 | 0.5897 | 0.5858 | 0.5716 | 0.5430 | 0.4831 |
| 1.7052 | 5600 | 0.1992 | 0.1523 | 0.5900 | 0.5859 | 0.5727 | 0.5435 | 0.4827 |
| 1.7357 | 5700 | 0.326 | 0.1506 | 0.5910 | 0.5873 | 0.5736 | 0.5456 | 0.4842 |
| 1.7661 | 5800 | 0.1698 | 0.1495 | 0.5907 | 0.5865 | 0.5739 | 0.5443 | 0.4842 |
| 1.7966 | 5900 | 0.2013 | 0.1489 | 0.5916 | 0.5889 | 0.5738 | 0.5457 | 0.4826 |
| 1.8270 | 6000 | 0.1371 | 0.1484 | 0.5912 | 0.5883 | 0.5739 | 0.5454 | 0.4840 |
| 1.8575 | 6100 | 0.1351 | 0.1483 | 0.5917 | 0.5886 | 0.5735 | 0.5456 | 0.4844 |
| 1.8879 | 6200 | 0.1678 | 0.1486 | 0.5925 | 0.5878 | 0.5733 | 0.5450 | 0.4840 |
| 1.9184 | 6300 | 0.1154 | 0.1483 | 0.5915 | 0.5874 | 0.5742 | 0.5461 | 0.4847 |
| 1.9488 | 6400 | 0.1576 | 0.1482 | 0.5913 | 0.5880 | 0.5743 | 0.5469 | 0.4833 |
| 1.9793 | 6500 | 0.1609 | 0.1478 | 0.5921 | 0.5880 | 0.5736 | 0.5466 | 0.4843 |
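The five `cosine_ndcg@10` columns above come from evaluating the same checkpoint with embeddings truncated to each Matryoshka dimensionality. A small usage sketch follows; the model id and passages are placeholders for this repository's artifacts, and `truncate_dim` is the Sentence Transformers mechanism for selecting a Matryoshka slice at inference time.

```python
from sentence_transformers import SentenceTransformer

# Placeholder model id; truncate_dim=256 selects the 256-dim Matryoshka slice.
model = SentenceTransformer("<this-repository-id>", truncate_dim=256, trust_remote_code=True)

query = "Chức năng của quan Đốc học trong việc quản lý giáo dục ở các tỉnh là gì?"
passages = ["<candidate passage 1>", "<candidate passage 2>"]  # placeholders

q_emb = model.encode(query)       # shape: (256,)
p_emb = model.encode(passages)    # shape: (2, 256)
scores = model.similarity(q_emb, p_emb)  # cosine similarity; higher = more relevant
```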
### Framework Versions
- Python: 3.11.11
- Sentence Transformers: 3.3.1
- Transformers: 4.49.0
- PyTorch: 2.5.1
- Accelerate: 1.2.1
- Datasets: 3.2.0
- Tokenizers: 0.21.0
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
```
#### MatryoshkaLoss
```bibtex
@misc{kusupati2024matryoshka,
title={Matryoshka Representation Learning},
author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
year={2024},
eprint={2205.13147},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
```
#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
--> | {"base_model": "Alibaba-NLP/gte-multilingual-base", "library_name": "sentence-transformers", "metrics": ["cosine_accuracy@1", "cosine_accuracy@3", "cosine_accuracy@5", "cosine_accuracy@10", "cosine_precision@1", "cosine_precision@3", "cosine_precision@5", "cosine_precision@10", "cosine_recall@1", "cosine_recall@3", "cosine_recall@5", "cosine_recall@10", "cosine_ndcg@10", "cosine_mrr@10", "cosine_map@100"], "pipeline_tag": "sentence-similarity", "tags": ["sentence-transformers", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:21892", "loss:MatryoshkaLoss", "loss:MultipleNegativesRankingLoss"], "widget": [{"source_sentence": "Sự khác biệt giữa các thời đại trong nghệ thuật trang trí rồng được thể hiện như thế nào qua các thời Hùng Vương, Lý, Trần, Hồ, Lê, Mạc, Nguyễn?", "sentences": ["Tài liệu tham khảo\r\n323. Nguyễn Quang Ngọc, “Mấy nhận xét về kết cấu kinh tế của \r\nmột số làng thương nghiệp ờ vùng đồng bằng Bắc Bộ thế kỳ \r\nXVIII-XIX”, Tạp chí Nghiên cứu Lịch sứ, số 5 (218), 1984.\r\n324. Nguyễn Quang Ngọc, Phan Đại Doãn, “Mấy ý kiến về hoạt \r\nđộng thương nghiệp ở nông thôn đồng bằng Bắc Bộ thế kỷ \r\nXVIII-XIX (hiện tượng và bản chất)”, Tạp chí Nghiên cứu\r\nLịch sử, số 5 (224), 1985.\r\n325. Nguyễn Quang Ngọc, “Thêm vài ý kiến về Tam Điệp”, Tạp \r\nchí Nghiên cứu Lịch sử, số 1 (244), 1989.\r\n326. Nguyễn Quang Ngọc, về một số làng buôn ở Đồng bàng Bắc \r\nBộ thế kỳ XVIII-XIX, Hội Sừ học Việt Nam, 1993.\r\n327. Nguyễn Quang Ngọc, Vũ Văn Quân, “Tư liệu về nguồn gốc \r\nchức năng và hoạt động cùa đội Hoàng Sa”, Tạp chí Khoa\r\nhọc xã hội, Đại học Quốc gia, t.XIV, số 3, 1998, ư. 10-20.\r\n328. Nguyễn Quang Ngọc, “Bảo vệ chủ quyền ưên Biển Đông: \r\nmột hoạt động nổi bật của vương triều Tây Sơn”, Tạp chí \r\nLịch sử quân sự, số 1, 1999, tr. 15-18.\r\n329. Nguyễn Quang Ngọc (Chủ biên), Tiến trình lịch sứ Việt Nam,\r\nNxb. Giáo dục, Hà Nội, 2001.\r\n330. Nguyền Quân, Phan cẩm Thượng, Mỹ thuật cùa người Việt,\r\nNxb. Mỹ thuật. Hà Nội. 1989.\r\n331. Nguyễn Tài Thư (Chủ biên), Lịch sử tư tưởng Việt Nam, 2\r\ntập, Nxb. Khoa học xã hội, Hà Nội, 1993.\r\n332. Nguyễn Tài Thư, Nho học và Nho học ớ Việt Nam: Một số lý\r\nluận và thực tiễn, Nxb. Khoa học xã hội, Hà Nội, 1997.\r\n333. Nguyễn Tưòmg Phượng, Binh chế Việt Nam qua các thời đại,\r\nNgày Mai, 1950.", "Ba Thục, Kinh Sở, Ngô Việt…). Kết thúc cuộc \"Hán Sở tranh hùng\", nhà Hán\r\nđã thống nhất đất nước Trung Hoa từ bắc xuống nam (tiền bắc hậu nam) và phát\r\ntriển đất nước theo một trật tự ngược lại: tiền nam hậu bắc\".\r\nCó thể hình dung cơ cấu của văn hóa Trung Hoa như sau: \r\nVĂN HOÁ\r\nTRUNG\r\nHOA\r\n=\r\nVăn hoá lưu vực sông Hoàng Hà\r\n+\r\nVăn hoá nông\r\nnghiệp lúa nước\r\nĐông Nam Á\r\nVăn hoá du\r\nmục Tây Bắc +\r\nVăn hoá nông\r\nnghiệp khối Trung\r\nNguyên\r\nMối liên hệ và sự tác động qua lại giữa văn hóa Việt Nam với Trung Hoa,\r\ngiữa văn hóa phương Bắc cổ đại với văn hóa phương Nam cổ đại (trong đó có\r\nvăn hóa Nam – Á - Bách Việt) có thể trình bày trong bảng 1.5.\r\nVĂN HOÁ\r\nP.BẮC CỔ ĐẠI\r\nVĂN HOÁ PHƯƠNG NAM (= Đ.N.Á cổ đại)\r\nVăn hoá Nam-Á (Bách Việt)\r\nVăn hóa vùng lưu\r\nvực sông Hoàng\r\nHà\r\nVăn hóa vùng lưu\r\nvực sông Dương\r\nTử\r\nVăn hóa vùng lưu\r\nvực s. Hồng, s.\r\nMã\r\nVăn hóa miền\r\nTrung và đồng\r\nbằng s. 
Mê Kông\r\nVĂN HOÁ TRUNG HOA VĂN HOÁ VIỆT NAM\r\nBảng 1.5: Quan hệ cội nguồn giữa văn hóa Việt Nam và Trung Hoa\r\nBài 3: TIẾN TRÌNH VĂN HÓA VIỆT NAM\r\nTiến trình văn hóa Việt Nam có thể chia thành 6 giai đoạn: văn hóa tiền\r\nsử, văn hóa Văn Lang - Âu Lạc, văn hóa thời chống Bắc thuộc, văn hóa Đại\r\nViệt, văn hóa Đại Nam và văn hóa hiện đại. Sáu giai đoạn này tạo thành ba lớp:\r\nlớp văn hóa bản địa, lớp văn hóa giao lưu với Trung Hoa và khu vực, lớp văn\r\nhóa giao lưu với phương Tây.\r\n3.1. Lớp văn hóa bản địa\r\n28\r\nDownloaded by Tu?n ?ào Minh ([email protected])\r\nlOMoARcPSD|49704028", "trái), và hình bán nguyệt (đôi dưới, phải). Trước mắt ta là sự hòa hợp tuyệt vời\r\ncủa cái động (vật nhau) trong thế tĩnh của ba hình hình học với những cạnh đáy\r\nvững vàng cho thấy sự ngang sức ngang tài của các chàng trai; sự vận động liên\r\ntục của cơ bắp như dừng lại. Hai người chờ vật được khuôn lại trong hai hình\r\nchữ nhật đứng tạo nên cảm giác co ro bất tận trong cái rét của lễ hội đầu xuân.\r\n4.1.3. Thủ pháp mô hình hóa đã tạo nên một nền nghệ thuật trang trí và\r\nnhiều mô hình mang tính triết lí sâu sắc.\r\nBộ Tứ Linh (Hình 4.20a) với long (rồng) biểu trưng cho uy là nam tính; li\r\n(= long mã) hoặc lân (kì lân, con vật tưởng tượng đầu sư tử, mình nai, đuôi trâu,\r\n131\r\nDownloaded by Tu?n ?ào Minh ([email protected])\r\nlOMoARcPSD|49704028\r\năn cỏ, rất hiền lành - hình 4.20b) biểu trưng cho ước vọng thái bình, quy (rùa)\r\nhiểu tượng cho sự sống lâu và phượng (phụng) biểu tượng cho nữ tính. Rồng -\r\nPhượng biểu tượng cho hạnh phúc lứa đôi (ở Trung Hoa hiên tượng này là\r\n“loan-phượng”: loan là con đực, phượng là con cái). Đồ án trang trí RỒNG phổ\r\nbiến đến mức phản ánh những đặc trưng cửa từng thời đại. Rồng thời Hùng\r\nvương, thời Lí, Trần, Hồ, Lê, Mạc, Nguyễn – mỗi thời có những nét đặc thù\r\nriêng tương ứng với thời đại của mình.\r\nTứ linh cộng thêm ngư-phúc-hạc-hổ thì thành BÁT VẬT. Ngư (Cá) gắn\r\nvới truyền thuyết \"cá hóa rồng\" biểu tượng cho sự thành đạt. Chữ phúc là “sự tốt\r\nlành, may mắn” đồng âm và viết gần giống với chữ bức nghĩa là \"con dơi\", vì"]}, {"source_sentence": "Nhiệm vụ quan trọng nhất của các nước công nghiệp chủ nghĩa châu Âu và Nhật Bản sau chiến tranh thế giới thứ hai là gì?", "sentences": ["Dupuis phái tự mình hành động. Tháng 10-1872, Dupuis đi Hương \r\nCảng và Thượng Hải mua pháo thuyền và đạn dược, mộ quân lính,\r\n1. Đó là các cuộc thám hiểm cùa phái đoàn Doudard de Lagrée và Francis \r\nGamier vào những năm từ 1866 đến 1870.\r\n2. Nguyễn Phan Quang (1949), Việt Nam thế ky XIX (1802-1884), Nxb. \r\nThành phố Hồ Chí Minh, tr. 321.\r\n159\r\nLỊCH SỪ VIỆT NAM - TẬP 6\r\nrồi đến tháng 11 năm đó thì kéo nhau về Bắc Kỳ. Cùng lúc đó, bọn \r\nthực dân hiếu chiến ở Nam Kỳ cũng lợi dụng việc triều đình Huế \r\nyêu cầu đưa ra Bắc tiễu trừ giặc biển để phái tàu chiến ra tiếp tay \r\ncho Dupuis. Cậy có lực lượng mạnh, Dupuis buộc Kinh lược sứ Lê \r\nTuấn trong vòng hai tuần phải xin triều đình Huế cho phép hắn \r\nđược mượn đường đi lên Vân Nam. Nhung hạn 2 tuần chưa hết và \r\ngiấy phép cũng chưa có mà Dupuis đã nổ súng, rồi tự tiện kéo đoàn \r\ntàu vào Cửa cấm (Hải Phòng) ngược sông Hồng lên Hà Nội (ngày \r\n22-12-1872). 
Theo sử nhà Nguyễn thì ngày 2-12-1872, Dupuis “từ\r\nHài Dương đi đen Bắc Ninh, Hà Nội, các quan tình và quân thứ 2-\r\n3 lần biện bác ngăn trở không cho đi, nhưng chúng không nghe\r\nTrong khoảng thời gian từ năm 1872 đến năm 1873, Dupuis đã ỷ \r\nthế quân Pháp và triều đình nhà Thanh, trắng trợn xâm phạm chủ \r\nquyền Việt Nam, liên tiếp gây ra nhiều vụ khiêu khích, cướp phá \r\nđối với nhân dân dọc hai bờ sông, tấn công các đồn bốt của triều \r\nđình nhà Nguyễn.\r\nTrước hành động ngang ngược cùa Dupuis, quân dân Hà Nội \r\nmặc dù chưa có lệnh triều đình nhung vẫn tích cực đề phòng. Lệnh", "hội loài người nói chung hay cùa một quốc gia, một dân tộc nói \r\nriêng. Nghiên cứu lịch sử là nhằm tìm hiểu những sự kiện xảy ra \r\ntrong quá khứ để từ đó rút ra các bài học kinh nghiệm cho hiện tại \r\nvà tương lai. Nghiên cứu và biên soạn lịch sừ, vì vậy, trở thành một \r\nyêu cầu bức thiết của mọi quốc gia, dân tộc. Phạm Công Trứ, nhà \r\nchính trị danh tiếng, nhà sử học sống ở thế kỳ XVII, trong bài Tựa\r\nsách Đại Việt sử ký bản kỷ tục biên viết: \"Vì sao mà làm quốc sử?\r\nVĩ sử chù yếu là để ghi chép sự việc. Có chinh trị cùa một đời tất\r\nphải có sử của một đời. Mà ngòi bút chép sử giữ nghị luận rất\r\nnghiêm, ca ngợi đời thịnh trị thì sáng tỏ ngang với mặt trời, mặt\r\ntrăng, lên án kẻ loạn tặc thì gay gắt nhu sương thu lạnh buốt,\r\nngười thiện biết có thể bắt chước, người ác biết có thể tự răn, quan\r\nhệ đến việc chính trị không phải là không nhiều. Cho nên làm sử là\r\ncốt để cho được như thế\"'.\r\nViệt Nam là một dân tộc có lịch sử lâu đời. Việt Nam cũng là \r\nmột dân tộc yêu sử và có rất nhiều người ham thích tìm tòi, nghiên \r\ncứu và biên soạn lịch sử. Đã có nhiều công trình lịch sử được công \r\nbố, không chi do các cơ quan, tổ chức chuyên nghiên cứu biên \r\nsoạn, mà còn do cá nhân người yêu sử thực hiện... Điều này vừa có \r\nmặt tích cực, lại cỏ mặt tiêu cực. Tích cực vì sẽ góp phần giúp nhân \r\ndân hiểu thêm về lịch sử nước nhà, nhưng cũng chứa đựng yếu tố \r\ntiêu cực là dễ dẫn tới những hiểu biết phiến diện, sai lầm về lịch \r\nsử... đôi khi đồng nhất truyền thuyết với lịch sử?", "LỊCH SỪ VIỆT NAM - TẬP 11\r\ngiầu mạnh hcm nhờ chiến tranh. Những nước bại trận như Đức, Ý, \r\nNhật thì kiệt quệ. Song dù thắng hay bại, sự kết thúc chiến tranh đặt \r\ncho mỗi nước những yêu cầu cấp bách cần giải quyết, tạo nên \r\nnhững đặc trưng kinh tế - xã hội ở nhóm nước này.\r\nSau chiến tranh thế giới, những nưóc công nghiệp chủ nghĩa \r\nchâu Âu và Nhật Bản đều bị chiến tranh tàn phá nặng nề. Nhiệm vụ \r\nquan trọng của họ ỉà hàn gắn vết thương chiến tranh, khôi phục \r\nkinh tế, ổn định đời sống xã hội. Đối với Mỹ, nhiệm vụ chủ yếu là \r\nphải chuyển hướng vận hành kinh tế từ một nền kinh tế phục vụ \r\nquân sự thời chiến sang nền kinh tế thời bình.\r\nNhừng nét cơ bản của tình hình thế giới nêu trên đã tác động \r\nđến hầu hết các khu vực trên thế giới, đặc biệt là khu vực Châu Á \r\nvà Đông Nam Á, tạo điều kiện thuận lợi cho cuộc đấu tranh giải \r\nphóng của các dân tộc Đông Dương. Từ đầu những năm 1950, tình \r\nhình cách mạng ba nước Đông Dương chuyển biến nhanh chóng. \r\nVới cuộc đi thăm Trung Quốc, Liên Xô của Chủ tịch Hồ Chí Minh \r\nđầu năm 1950 và việc các nước xã hội chủ nghĩa công nhận và đặt \r\nquan hệ ngoại giao với Chính phủ Việt Nam Dân chủ Cộng hòa là \r\nmột thắng lợi ngoại giao vô cùng quan trọng. 
Thắng lợi về ngoại \r\ngiao này đã chấm dứt thời kỳ chiến đấu đom độc, hầu như bị cách ly \r\nvới bên ngoài và từ đó tiếp nhận được sự đồng tình về chính trị và \r\nsự viện trợ về vật chất.\r\nVới sự giúp đỡ của Liên Xô, Trung Quốc và các nước xã hội"]}, {"source_sentence": "Chức năng của quan Đốc học trong việc quản lý giáo dục ở các tỉnh là gì?", "sentences": ["Định, Phú Yên, Biên Hoà, Gia Định, Vĩnh Long, Định Tường, An \r\nGiang đều đặt mỗi tỉnh một quan Đốc học coi việc học chính trong \r\ntinh. Các tỉnh từ Quảng Trị, Quảng Bình, Hà Tĩnh, Nghệ An, \r\nThanh Hoá, Ninh Bình, Nam Định, Hà Nội, Hưng Yên, Hải Dương, \r\nSơn Tây, Bắc Ninh cũng đều đật chức Đốc học. Tinh nào khuyết \r\nchức Đốc học thì đặt Thự đốc học tạm quyền đốc học một thời gian \r\nđổ phụ trách, đôn đốc việc học trong tỉnh.\r\nCác tỉnh Khánh Hoà, Bình Thuận, Hà Tiên, Quảng Yên, Hưng \r\nHoá, Tuyên Quang, Thái Nguyên, Lạng Sơn, Cao Bằng, do số học \r\nsinh ít nên đến cuối thời Thiệu Trị (1847) vẫn chưa đặt chức Đốc học.\r\nTheo lệ Nhà nước chế cấp ấn quan phòng giao cho Đốc học lo \r\nviệc học chính trong địa hạt của tinh sờ tại, trong đó có việc xây \r\ndựng trường sở ở tinh, phù, hoặc huyện, châu; sắp xếp các thày \r\ngiáo và tuyển chọn học sinh vào học ở các trường. Những công \r\nviệc licn quun đén việc học đểu có sự phối hựp giữa quan Đốc hục \r\nvới các viên giữ chức Giáo thụ ở các phủ và Huấn đạo ờ các huyện, \r\nchâu. Một bộ máy giáo dục được tổ chức chặt chẽ theo ngành dọc \r\ntừ tinh đến phủ, huyện, châu; tổng (ở tổng có Tổng giáo) để theo \r\ndõi, đôn đốc việc giảng dạy và học tập, đã góp phần đẩy mạnh hom \r\nviệc giáo dục ở những triều vua Nguyễn nửa đầu thế kỳ XIX. Những \r\nthành tích của giáo dục bấy giờ biểu hiện rõ nhất ở việc Nhà nước \r\ncứ 3 năm lại mở một kỳ thi Hương ờ một số tinh thuộc Bác Kỳ (Nam \r\nĐịnh, Hài Dương, Thăng Long); Nghệ An; kinh đô Huế; Trung Kỳ", "Trước tình hình thế giới và trong nước ngày càng khẩn trương, ngày 28 - I - 1941,\r\nlãnh tụ Nguyễn Ái Quốc về nước triệu tập Hội nghị lần thứ 8 Ban Chấp hành\r\nTrung ương Đảng Cộng sản Đông Dương. Hội nghị họp tại Pác Bó (Cao Bằng) từ\r\nngày 10 đến ngày 19 - 5 - 1941.\r\nHội nghị chủ †rương trước hết phởi giỏi phóng cho được cóc dôn tộc\r\nĐông Dương ro khỏi éch Phớp - Nhột. Hội nghị quyết định tiếp tục tạm\r\ngóc khổu hiệu “Đónh đổ địa chủ, chia ruộng đốt cho dôn còy” thay bằng\r\ncóc khổu hiệu “Tịch thu ruộng đốt của đế quốc vò Việt gian chia cho dên\r\ncòy nghèo, giởm †ô, giỏm tức, chia lợi ruộng công”, tiến tới thực hiện\r\n“Người còy có ruộng”. Hội nghị chủ trương †hònh lộp Việt Nơm độc lập\r\nđồng minh (gọi tốt lò Việt Minh) bao gồm céc †ổ chức quồn chúng, lốy\r\ntên lò Hội Cứu quốc nhồm : “Liên hiệp hết thỏy cóc giới đồng bèo yêu\r\nnước, không phôn biệt giòu nghèo, giò trẻ, gới trai, không phôn biệt tôn\r\ngiáo vò xu hướng chính trị, đặng cùng nhau mưu cuộc dôn tộc giỏi phóng\r\nvò sinh tồn” °°,\r\n\r\nMặt trận Việt Minh chính thức thành lập ngày 19 - 5 - 1941. Chỉ sau một thời\r\ngian ngắn, tổ chức này đã có uy tín và ảnh hưởng sâu rộng trong nhân dân. Sau Hội\r\nnghị Trung ương, lãnh tụ Nguyễn Ái Quốc đã gửi thư kêu gọi đồng bào cả nước\r\nđoàn kết thống nhất đánh đuổi Pháp - Nhật.", "\"Chính sự ngày một đổ nát, đói kém xảy ra luôn luôn. Nhân dân cùng\r\nquân, khốn khổ, giặc cướp nổi lên ở nhiễu nơi\".\r\n(Khâm định Việt sử thông giám cương mục)\r\n\r\nỞ Nghệ An, Thanh Hoá, Ninh Bình,... dân nghèo nổi dậy đấu tranh. 
Trong\r\ntình hình đó, một số thế lực phong kiến ở các địa phương lại đánh giết lẫn\r\nnhau, quấy phá nhân dân và chống lại triều đình. Nhà Lý phải dựa vào thế lực\r\nhọ Trần để chống lại các lực lượng nổi loạn nên đã tạo điều kiện và thời cơ cho\r\nhọ Trần buộc Chiêu Hoàng (vua cuối cùng của nhà Lý) phải nhường ngôi cho\r\nTrần Cảnh vào tháng 12, năm Ất Dậu (đâu năm 1226).\r\n\r\n(1) Việc thổ mộc : việc làm nhà cửa, chùa, đền, đào sông, hồ..."]}, {"source_sentence": "Thiệu Trị đã xử lý trường hợp của Lý Văn Phức và việc người Pháp bắt giữ thuyền quân đi tuần biển của Việt Nam ra sao?", "sentences": ["hóa; thuế độc quyền; thué điền thổ...\r\nTheo những con số thống kê chính thức thì các loại thuế trên \r\nđều tăng lên đáng kể, khoảng từ ba đến hơn ba lần vào năm 1945 \r\n(số dự thu) so với năm 1939 (số thực thu) như sau:\r\nBảng 29: Thu nhập từ một sổ loại thuế ở Đông Dương \r\ntrong các năm 1939 và 19453\r\nĐom vị: nghìn đồng\r\nThuế 1939 1945\r\nThuế tiêu thụ và vận chuyển hàng hoá 20.655.000 58.265.000\r\nThuế muối, rượu, thuốc phiện, diêm, pháo,\r\nthuốc lá\r\n24.694.000 87.000.000\r\nThuế điền thổ, trước bạ 11.821.000 28.625.000\r\nvề thuốc phiện, do việc nhập khẩu bị ngừng, Pháp khuyến khích \r\nnhân dân thượng du trồng loại cây này nên số thuốc phiện sản xuất \r\nđược ngày một tăng: năm 1940: 7.560kg; nãm 1941: 17.344kg; năm\r\n1. Annuaire statistique de V Union f,rariỊaise Outre- mer 1939-1946, tr. K -\r\n90-93.\r\n2, 3. Annuaire statistique de runion firanẹaise Outre - mer 1939-1946, tr.\r\nK-90.\r\n552", "Chương I. Chính sách thuộc địa của Pháp..\r\nbộ đồng bào các dân tộc thiểu số. về phương diện này, chính quyền \r\nthuộc địa còn muốn đi xa hơn là cố định đồng bào vào một không \r\ngian nhất định, rồi đưa họ đến với chế độ sở hữu ruộng đất - chế độ \r\nsở hữu tập thể và ấn định cho họ một chế độ thuế khóa.\r\nNhư vậy, “chính sách thâm nhập” có xuất phát điểm là chính \r\nsách “chia đế trf' và mục tiêu là tách các dân tộc thiểu số ra khỏi \r\ndân tộc Kinh, dùng dân tộc nọ chống lại dân tộc kia và nhằm một \r\nmục đích cao hơn là từ chinh phục, khuất phục về chính trị để tiến \r\nsang khai thác, bóc lột về đất đai, nhân công và thuế khóa của các \r\nđồng bào.\r\n7. Một số “cải cách” xã hội khác liên quan đến nông dân và\r\ncông nhân\r\nLiên quan đến nông dân, trong bài diễn văn về Tinh hình Đông\r\nDương và tuyên bo cải cách vào tháng 9/19301, Pierre Pasquier nêu \r\nra những vấn đề như: thi hành luật điền thổ, giúp nông dân Nam Kỳ \r\nthế chấp ruộng đất để vay tín dụng ngân hàng; dẫn thủy nhập điền, \r\nlàm thuỷ lợi để tăng diện tích canh tác, cải tiến kỹ thuật trồng trọt; \r\ngiúp nông dân thăng tién về sờ hữu ruộng đất (từ người không có \r\nđất lên tiểu điền chủ); mở rộng việc nhượng đất, khẩn hoang ở \r\nnhững vùng rừng núi ở Bắc và Trung Kỳ cũng như ở phía tây và \r\nnam Nam Kỳ; quy định lại chế độ lĩnh canh để \"hạn ché bớt sự bóc\r\nlột cùa địa chù đoi với tá điền”.\r\nTriển khai những “cải cách” này, Pierre Pasquier cho tiếp tục \r\nxây dựng các công trình thuỷ nông, rồi thành lập Hội đồng Khẩn", "theo vài mươi người, đeo gươm, đeo súng, đến thẳng ngay công \r\nquán, đưa ra một lá thư của nước Pháp bằng chữ Hán, lời lẽ ngang \r\nngược. Lý Văn Phức không nhận thư, Lạp Biệt Nhĩ quát to doạ nạt, \r\nđể lại thư xuống ghế rồi đi. Lý Văn Phức và Nguyễn Đình Tân bàn \r\nvới nhau rằng: \"Nhận lấy thư là có tội, mà đốt thư đi cũng có tội, \r\nkhông gì bằng cho chạy trạm về đệ tâu lên\". Lý Văn Phức về Kinh,\r\n1. Thực lục, tập VI, sđd, tr. 301.\r\n492\r\nChương VII. 
Quan hệ đối ngoại\r\nThiệu Trị giận là làm mất quốc thể, sai vệ cẩm y đóng gông đem \r\ngiam ở Tà đãi lậu, bắt giải chức, giao cho đình thần bàn.\r\nKhi ấy, bọn Pháp ngày thường lên bờ, ngông nghênh đi lại các \r\nnơi giao tiếp với dân đi đạo. Những thuyền quân đi tuần biển bị \r\nchúng bắt giữ lại ở cừa biển và cướp lấy buồm thuyền và dây buộc \r\nthuyền cùa 5 chiếc thuyền bọc đồng ở Kinh phái đi Nam (Kim \r\nƯng, Phấn Bằng, Linh Phượng, Thọ Hạc, Vân Bằng) đậu ở vụng \r\nTrà Sơn, đối diện vói chiến thuyền Pháp.\r\nViệc báo lên, Thiệu Trị sai ngay Đô thống Hữu quân Mai Công \r\nNgôn, Tham tri Bộ Hộ Đào Trí Phú đem biền binh 3 vệ Vũ lâm, Hổ \r\noai, Hùng nhuệ đến Quảng Nam cùng với lực lượng thủy, bộ tại \r\nchỗ tổ chức bố phòng. Thiệu Trị truyền chi căn dặn Mai Công \r\nNgôn và Đào Trí Phú rằng: \"Người Tây dương nếu đã sợ uy, thu \r\nhình, thì ta không nên tự động thủ trước; nếu chúng sinh chuyện \r\ntrước, thì đốc sức thành đài cùng biền binh các hiệu thuyền và \r\nthuyền đồng do Kinh phái đi, ngoài hợp, trong ứng, lập tức đánh"]}, {"source_sentence": "Gia Cát Lượng đã giúp ai trong việc quản lý nước Thục?", "sentences": ["phải trông coi mọi việc, giúp Thành Vương đến lúc trưởng thành. \r\n4\r\n Hoắc Quang giữ chức Đại tư mã tướng quân, phò Hán Chiêu Đế lúc lên ngôi mới 9 tuổi. \r\n5\r\n Gia Cát Lượng tức Khổng Minh, là thừa tướng của Chiêu Đế Lưu Bị nước Thục đời Tam Quốc. Lưu Bị chết, con là Lưu Thiện nối \r\nngôi, tức Thục Hậu chúa, mọi việc nước, việc quân đều phải trông cậy vào Gia Cát Lượng. \r\n6\r\n Tô Hiến Thành là Thái úy triều Lý Cao Tông, nhận di mệnh Cao Tông phò vua nhỏ là Long Cán lên nối ngôi mới 3 tuổi. \r\n7\r\n Tứ phụ: nghĩa là bốn viên đại thần giúp vua khi mới lên ngôi. \r\n8\r\n Chỉ Thuận Tông. \r\n9\r\n Xích chủy: nghĩa là mõm đỏ, miệng đỏ, hay đỏ mỏ. Xích chủy hầu là loài đỏ mỏ ám chỉ Lê Quý Ly. \r\n10 Bạch kê: nghĩa là gà trắng. Nghệ Tông sinh năm Tân Dậu, tức năm gà. Tân thuộc hành kim, loài kim sắc trắng. Vì thế \"bạch kê\" \r\nám chỉ Nghệ Tông. \r\n11 Chữ vương? ở trong lòng chữ khẩu? là chữ \"quốc\"?. \r\n12 Theo tục nhà Trần, hằng năm vào ngày mồng 4 tháng 4, vua hội họp bề tôi làm lễ tuyên thệ ở đền Đồng Cổ. (Xem bản kỷ, quyển \r\n5, Kiến Trung năm thứ 3, 1277). \r\n13 Chỉ Quý Ly. \r\n288 Đại Việt Sử Ký Toàn Thư - Bản Kỷ - Quyển VIII \r\nQuý Ly bỏ mũ, rập đầu khóc lóc từ tạ, chỉ trời vạch đất thề rằng: \r\n\"Nếu thần không biết dốc lòng trung, hết sức giúp Quan gia để truyền đến con cháu về sau thì \r\ntrời sẽ ghét bỏ thần\". \r\nQuý Ly lại nói: \"Lúc Linh Đức Vương làm điều thất đức, nếu không nhờ oai linh bệ hạ thì thần đã", "éo, xênh xang lạ hom cả\", và gánh xiếc của BẮc thành trổ tài dịp Đại \r\nkhánh \"Ngũ tuần\" của vua: \"4 đứa leo dây, đứa trẻ lộn dây, đứa trẻ \r\nmúa trên bàn tay 2 đứa\".\r\nNhững định chế về tổ chức và hoạt động nghệ thuật của nhà \r\nNguyễn đã có tác dụng quan ữọng kích thích các loại hình vãn nghệ \r\ndân gian phát triển cả về số lượng lẫn chất lượng. Trong các đợt biểu \r\ndiễn ở Kinh đô, trước yêu cầu thưởng lãm nghiêm ngặt và cao hơn \r\nđịa phương, các nhà viết kịch bản. đạo diễn, diễn viên phải trau dồi để \r\nnâng cao năng lực sáng tác, dàn dựng và kỹ năng biểu diễn.\r\n2. Nghệ thuật dân gian\r\nSinh hoạt văn nghệ dân gian trong các làng quê cũng phát triển. \r\nỞ Bắc Kỳ, Bắc Trung Kỳ, hát ả đào rất phổ biến. Bên cạnh đó là \r\ncác thể loại dân ca: hát Xoan ở Phú Thọ, Quan họ Bắc Ninh, hát \r\nSli, Then ở Lạng Sơn, hát Ví dặm, Phường vải ở Nghệ An, Hà \r\nTĩnh. 
Ở các tinh trung du và đồng bằng Bắc Bộ, Thanh Hóa, chèo \r\nsân đình mang tính trào lộng nở rộ. Thể loại trò hài, xiếc ở Bắc Kỳ \r\ncũng thu hút đông đảo khán giả.\r\n639", "Tây. Ngoài cơ sờ đúc súng cũ của tiên triều, năm 1825 vua Minh \r\nMệnh mờ thêm sáu xưởng nữa. vốn cần cù và ham học hỏi sáng \r\ntạo, những người thợ quân giới đã được \"thứ súng tay nạp thuốc nổ \r\nmạnh theo kiểu Tây dương\". Vào những năm cuối triều Minh \r\nM ệnh, họ đã đúc 15 cỗ đại pháo X ung tiêu băng đồng và hai cỗ \r\nsúng lớn Chấn hải, loại đại pháo lợi hại trong thủy chiến phương \r\nTây. Sau đó, lại xuất xưởng tiếp 30 cỗ Chấn hải. Năm 1829, quản \r\nkho Hải Dương là Tôn Thất Thiện cùng với 100 lính Chấn cơ chế \r\nra cối gỗ chạy bàng sức nước ở khe suối để giã, luyện thuốc súng. \r\nDụng cụ này là xe \"Thủy hỏa ký tế\", và những năm sau được phổ \r\ncập trong quân ngũ. Từ vũ khí phương Tây, người Đại Nam đã tự \r\ntìm hiểu từng chi tiết để chế tạo thước đo ngắm bắn, thước kiểm tra \r\nthuốc súng. Trong bảy năm ờ ngôi, vua Thiệu Trị đúc 9 cỗ súng \r\nbàng đồng hiệu là \"Thần uy phục viễn đại tướng quân\", cỗ to nhất \r\nlà 10.706 cân, cỗ nhỏ nhất là 10.222 cân, tổng cộng là 93.829 cân.\r\n649\r\nLỊCH SỬ VIỆT NAM - TẬP 5\r\nVà ba cỗ súng hiệu \"Bảo Đại định công an dân hòa chúng thượng \r\ntướng quân\", mỗi cỗ trên 14.500 cân, tổng cộng là 43.620 cân1.\r\nĐe tạo điều kiện cho quân thủy học tập, bộ Công cấp cho họ la \r\nbàn, thước đo nước, đồng hồ cát xem giờ của phương Tây. v ề khoa \r\nmục bắn súng thì lính thủy phải tập bắn súng điểu sang và đại bác. \r\nMinh Mệnh yêu cầu Hiệp biện Đại học sĩ lãnh Thượng thư bộ Binh \r\nTrương Đăng Quế đọc kỹ các sách và bản đồ thủy chiến \"Tây"]}], "model-index": [{"name": "SentenceTransformer based on Alibaba-NLP/gte-multilingual-base", "results": [{"task": {"type": "information-retrieval", "name": "Information Retrieval"}, "dataset": {"name": "gte multilingual base 768", "type": "gte_multilingual_base_768"}, "metrics": [{"type": "cosine_accuracy@1", "value": 0.3972602739726027, "name": "Cosine Accuracy@1"}, {"type": "cosine_accuracy@3", "value": 0.6333333333333333, "name": "Cosine Accuracy@3"}, {"type": "cosine_accuracy@5", "value": 0.7132420091324201, "name": "Cosine Accuracy@5"}, {"type": "cosine_accuracy@10", "value": 0.7817351598173516, "name": "Cosine Accuracy@10"}, {"type": "cosine_precision@1", "value": 0.3972602739726027, "name": "Cosine Precision@1"}, {"type": "cosine_precision@3", "value": 0.21111111111111108, "name": "Cosine Precision@3"}, {"type": "cosine_precision@5", "value": 0.142648401826484, "name": "Cosine Precision@5"}, {"type": "cosine_precision@10", "value": 0.07817351598173515, "name": "Cosine Precision@10"}, {"type": "cosine_recall@1", "value": 0.3972602739726027, "name": "Cosine Recall@1"}, {"type": "cosine_recall@3", "value": 0.6333333333333333, "name": "Cosine Recall@3"}, {"type": "cosine_recall@5", "value": 0.7132420091324201, "name": "Cosine Recall@5"}, {"type": "cosine_recall@10", "value": 0.7817351598173516, "name": "Cosine Recall@10"}, {"type": "cosine_ndcg@10", "value": 0.5921213055171655, "name": "Cosine Ndcg@10"}, {"type": "cosine_mrr@10", "value": 0.5309868087265359, "name": "Cosine Mrr@10"}, {"type": "cosine_map@100", "value": 0.537969151887342, "name": "Cosine Map@100"}]}, {"task": {"type": "information-retrieval", "name": "Information Retrieval"}, "dataset": {"name": "gte multilingual base 512", "type": "gte_multilingual_base_512"}, "metrics": [{"type": "cosine_accuracy@1", "value": 0.38767123287671235, "name": "Cosine 
Accuracy@1"}, {"type": "cosine_accuracy@3", "value": 0.6310502283105023, "name": "Cosine Accuracy@3"}, {"type": "cosine_accuracy@5", "value": 0.7095890410958904, "name": "Cosine Accuracy@5"}, {"type": "cosine_accuracy@10", "value": 0.7821917808219178, "name": "Cosine Accuracy@10"}, {"type": "cosine_precision@1", "value": 0.38767123287671235, "name": "Cosine Precision@1"}, {"type": "cosine_precision@3", "value": 0.21035007610350073, "name": "Cosine Precision@3"}, {"type": "cosine_precision@5", "value": 0.14191780821917807, "name": "Cosine Precision@5"}, {"type": "cosine_precision@10", "value": 0.07821917808219177, "name": "Cosine Precision@10"}, {"type": "cosine_recall@1", "value": 0.38767123287671235, "name": "Cosine Recall@1"}, {"type": "cosine_recall@3", "value": 0.6310502283105023, "name": "Cosine Recall@3"}, {"type": "cosine_recall@5", "value": 0.7095890410958904, "name": "Cosine Recall@5"}, {"type": "cosine_recall@10", "value": 0.7821917808219178, "name": "Cosine Recall@10"}, {"type": "cosine_ndcg@10", "value": 0.5879636635574841, "name": "Cosine Ndcg@10"}, {"type": "cosine_mrr@10", "value": 0.525339204174821, "name": "Cosine Mrr@10"}, {"type": "cosine_map@100", "value": 0.5318727014135456, "name": "Cosine Map@100"}]}, {"task": {"type": "information-retrieval", "name": "Information Retrieval"}, "dataset": {"name": "gte multilingual base 256", "type": "gte_multilingual_base_256"}, "metrics": [{"type": "cosine_accuracy@1", "value": 0.3771689497716895, "name": "Cosine Accuracy@1"}, {"type": "cosine_accuracy@3", "value": 0.6146118721461187, "name": "Cosine Accuracy@3"}, {"type": "cosine_accuracy@5", "value": 0.6872146118721462, "name": "Cosine Accuracy@5"}, {"type": "cosine_accuracy@10", "value": 0.7662100456621005, "name": "Cosine Accuracy@10"}, {"type": "cosine_precision@1", "value": 0.3771689497716895, "name": "Cosine Precision@1"}, {"type": "cosine_precision@3", "value": 0.20487062404870623, "name": "Cosine Precision@3"}, {"type": "cosine_precision@5", "value": 0.13744292237442923, "name": "Cosine Precision@5"}, {"type": "cosine_precision@10", "value": 0.07662100456621006, "name": "Cosine Precision@10"}, {"type": "cosine_recall@1", "value": 0.3771689497716895, "name": "Cosine Recall@1"}, {"type": "cosine_recall@3", "value": 0.6146118721461187, "name": "Cosine Recall@3"}, {"type": "cosine_recall@5", "value": 0.6872146118721462, "name": "Cosine Recall@5"}, {"type": "cosine_recall@10", "value": 0.7662100456621005, "name": "Cosine Recall@10"}, {"type": "cosine_ndcg@10", "value": 0.5736037026704126, "name": "Cosine Ndcg@10"}, {"type": "cosine_mrr@10", "value": 0.5116503587736474, "name": "Cosine Mrr@10"}, {"type": "cosine_map@100", "value": 0.5189035063838257, "name": "Cosine Map@100"}]}, {"task": {"type": "information-retrieval", "name": "Information Retrieval"}, "dataset": {"name": "gte multilingual base 128", "type": "gte_multilingual_base_128"}, "metrics": [{"type": "cosine_accuracy@1", "value": 0.36118721461187214, "name": "Cosine Accuracy@1"}, {"type": "cosine_accuracy@3", "value": 0.582648401826484, "name": "Cosine Accuracy@3"}, {"type": "cosine_accuracy@5", "value": 0.6502283105022831, "name": "Cosine Accuracy@5"}, {"type": "cosine_accuracy@10", "value": 0.7342465753424657, "name": "Cosine Accuracy@10"}, {"type": "cosine_precision@1", "value": 0.36118721461187214, "name": "Cosine Precision@1"}, {"type": "cosine_precision@3", "value": 0.1942161339421613, "name": "Cosine Precision@3"}, {"type": "cosine_precision@5", "value": 0.1300456621004566, "name": "Cosine Precision@5"}, {"type": 
"cosine_precision@10", "value": 0.07342465753424657, "name": "Cosine Precision@10"}, {"type": "cosine_recall@1", "value": 0.36118721461187214, "name": "Cosine Recall@1"}, {"type": "cosine_recall@3", "value": 0.582648401826484, "name": "Cosine Recall@3"}, {"type": "cosine_recall@5", "value": 0.6502283105022831, "name": "Cosine Recall@5"}, {"type": "cosine_recall@10", "value": 0.7342465753424657, "name": "Cosine Recall@10"}, {"type": "cosine_ndcg@10", "value": 0.5465887777560341, "name": "Cosine Ndcg@10"}, {"type": "cosine_mrr@10", "value": 0.4866068710589268, "name": "Cosine Mrr@10"}, {"type": "cosine_map@100", "value": 0.49427672079491064, "name": "Cosine Map@100"}]}, {"task": {"type": "information-retrieval", "name": "Information Retrieval"}, "dataset": {"name": "gte multilingual base 64", "type": "gte_multilingual_base_64"}, "metrics": [{"type": "cosine_accuracy@1", "value": 0.3082191780821918, "name": "Cosine Accuracy@1"}, {"type": "cosine_accuracy@3", "value": 0.5146118721461187, "name": "Cosine Accuracy@3"}, {"type": "cosine_accuracy@5", "value": 0.5863013698630137, "name": "Cosine Accuracy@5"}, {"type": "cosine_accuracy@10", "value": 0.6621004566210046, "name": "Cosine Accuracy@10"}, {"type": "cosine_precision@1", "value": 0.3082191780821918, "name": "Cosine Precision@1"}, {"type": "cosine_precision@3", "value": 0.17153729071537288, "name": "Cosine Precision@3"}, {"type": "cosine_precision@5", "value": 0.11726027397260275, "name": "Cosine Precision@5"}, {"type": "cosine_precision@10", "value": 0.06621004566210045, "name": "Cosine Precision@10"}, {"type": "cosine_recall@1", "value": 0.3082191780821918, "name": "Cosine Recall@1"}, {"type": "cosine_recall@3", "value": 0.5146118721461187, "name": "Cosine Recall@3"}, {"type": "cosine_recall@5", "value": 0.5863013698630137, "name": "Cosine Recall@5"}, {"type": "cosine_recall@10", "value": 0.6621004566210046, "name": "Cosine Recall@10"}, {"type": "cosine_ndcg@10", "value": 0.4843188931282978, "name": "Cosine Ndcg@10"}, {"type": "cosine_mrr@10", "value": 0.4275081539465107, "name": "Cosine Mrr@10"}, {"type": "cosine_map@100", "value": 0.4370689716929827, "name": "Cosine Map@100"}]}]}]} |
legaltextai/modernbert-embed-ft-const-legal-matryoshka | legaltextai | sentence-similarity | [
"sentence-transformers",
"safetensors",
"modernbert",
"sentence-similarity",
"feature-extraction",
"generated_from_trainer",
"dataset_size:842",
"loss:MatryoshkaLoss",
"loss:MultipleNegativesRankingLoss",
"en",
"arxiv:1908.10084",
"arxiv:2205.13147",
"arxiv:1705.00652",
"base_model:nomic-ai/modernbert-embed-base",
"base_model:finetune:nomic-ai/modernbert-embed-base",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
]
| 2025-02-17T00:05:31 | 2025-02-17T00:06:03 | 29 | 1 | ---
base_model: nomic-ai/modernbert-embed-base
language:
- en
library_name: sentence-transformers
license: apache-2.0
metrics:
- cosine_accuracy@1
- cosine_accuracy@3
- cosine_accuracy@5
- cosine_accuracy@10
- cosine_precision@1
- cosine_precision@3
- cosine_precision@5
- cosine_precision@10
- cosine_recall@1
- cosine_recall@3
- cosine_recall@5
- cosine_recall@10
- cosine_ndcg@10
- cosine_mrr@10
- cosine_map@100
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:842
- loss:MatryoshkaLoss
- loss:MultipleNegativesRankingLoss
widget:
- source_sentence: Discuss the implications of the Insular Cases on the application
of the Citizenship Clause to American Samoa, particularly in distinguishing between
incorporated and unincorporated territories. What are the practical concerns associated
with this distinction?
sentences:
- 'To the extent jus soli is adopted into the Fourteenth Amendment, the concept
of allegiance is manifested by the Citizenship Clause’s mandate that birthright
citizens not merely be born within the territorial boundaries of the United States
but also “subject to the jurisdiction thereof…” [citations omitted]
Appellants would find any allegiance requirement of no moment because, as non-citizen
nationals, American Samoans already “owe[ ] permanent allegiance to the United
States.”[citations omitted] Yet, within the context of the Citizenship Clause,
“[t]he evident meaning of the[ ] ... words [“subject to the jurisdiction thereof”]
is, not merely subject in some respect or degree to the jurisdiction of the United
States, but completely subject to their political jurisdiction, and owing them
direct and immediate allegiance.” **375 [citations omitted] *306 It was on this
basis that the Supreme Court declined to extend constitutional birthright citizenship
to Native American tribes. [citations omitted]…Even assuming a background context
grounded in principles of jus soli, we are skeptical the framers plainly intended
to extend birthright citizenship to distinct, significantly self-governing political
territories within the United States’s sphere of sovereignty—even where, as is
the case with American Samoa, ultimate governance remains statutorily vested with
the United States Government. [citations omitted]
III
Analysis of the Citizenship Clause’s application to American Samoa would be incomplete
absent invocation of the sometimes contentious Insular Cases, where the Supreme
Court “addressed whether the Constitution, by its own force, applies in any territory
that is not a State.” [citations omitted]
“The doctrine of ‘territorial incorporation’ announced in the Insular Cases distinguishes
between incorporated territories, which are intended for statehood from the time
of acquisition and in which the entire Constitution applies ex proprio vigore,
and unincorporated territories [such as American Samoa], which are not intended
for statehood and in which only [certain] fundamental constitutional rights apply
by their own force.”[citations omitted].
Appellants and Amici contend the Insular Cases have no application because the
Citizenship Clause textually defines its own scope.[citations omitted].
Amici Curiae suggest territorial incorporation doctrine should not be expanded
to the Citizenship Clause because the doctrine rests on anachronistic views of
race and imperialism. But the Court has continued to invoke the Insular framework
when dealing with questions of territorial and extraterritorial application. [citations
omitted] Although some aspects of the Insular Cases’ analysis may now be deemed
politically incorrect, the framework remains both applicable and of pragmatic
use in assessing the applicability of rights to unincorporated territories. [citations
omitted]
As the Supreme Court…emphasized, the “common thread uniting the Insular Cases
... [is that] questions of extraterritoriality turn on objective factors and practical
concerns, not formalism.” [citations omitted] While “fundamental limitations in
favor of personal rights” remain guaranteed to persons born in the unincorporated
territories, [citations omitted], the Insular framework recognizes the difficulties
that frequently inure when “determin[ing] [whether a] particular provision of
the Constitution is applicable,” absent inquiry into the impractical or anomalous.
[citations omitted]
A
American citizenship “is one of the most valuable rights in the world today.”
[citations omitted] “The freedoms and opportunities secured by United States citizenship
long have been treasured by persons fortunate enough to be born with them, and
are yearned for by countless less fortunate.” [citations omitted]. Accordingly,
even if the Insular framework is applicable, Appellants cite to a bevy of cases
to argue citizenship is a fundamental right. [citations omitted] But those cases
do not arise in the territorial context. Such decisions do not reflect the Court’s
considered judgment as to the existence of a fundamental right to citizenship
for persons born in the United States’ unincorporated **377 *308 territories.
[citations omitted].7
“Fundamental” has a distinct and narrow meaning in the context of territorial
rights. It is not sufficient that a right be considered fundamentally important
in a colloquial sense or even that a right be “necessary to [the] [ ]American
regime of ordered liberty.” [citations omitted]. Under the Insular framework the
designation of fundamental extends only to the narrow category of rights and “principles
which are the basis of all free government.” [citations omitted]
In this manner the Insular Cases distinguish as universally fundamental those
rights so basic as to be integral to free and fair society.'
- '633, 649 (concurring opinion).
An innkeeper or common carrier has always been allowed to'' exclude drunks, criminals
and'' diseased persons, but only because the public’s interest in protecting his
and his guests’ health and property outweighs its interest in providing accommodations
for this small group of travelers. As a general rule, innkeepers and carriers
cannot refuse their services on account of race; though the rule developed in
this country that they can provide “separate but equal” facilities. And for a
period of our history even,this Court upheld state laws giving sanction to such
a rule. Compare Plessy v. Ferguson, 163 U. S. 537, with Gayle v. Browder, 352
U. S. 903, affirming, 142 F. Supp. 707. But surely Shelley v. Kraemer, supra,
and Barrows v. Jackson, supra, show that the day has passed when an innkeeper,
carrier, housing developer, or retailer can draw a• racial'' line, refuse service
to some on account of color, and obtain the aid of a State in enforcing his personal
bias by sending outlawed customers to prison or exacting fines from them.
Business, such as this restaurant, is still private property. '' Yet there is
hardly any private enterprise that does not feel the pinch of some public regulation
— from price control, to health and fire inspection, to zoning, to safety measures,
to minimum wages and working conditions, to unemployment insurance. When the doors
of a business are open to the public, they must be open to all regardless of race
if apartheid is not to become engrained in our public places. It cannot by reason
of the Equal Protection Clause become so engrained with the aid of state courts,
state legislatures, or state police.
II.
There is even greater reason to bar a State through its judiciary from throwing
its weight on the side of racial discrimination in the present case, because we
deal here with a place of public accommodation under license from, the State.
This is the idea I expressed in Garner v. Louisiana, 368 U. S. 157, where another
owner of a restaurant refused service to a customer because he was a Negro. That
view is not novel; it.stems from the dissent of the first Mr. Justice Harlan in
the Civil Rights Cases, 109 U. S. 3, 58-59:
“In every material sense applicable to the practical enforcement of the Fourteenth
Amendment, railroad corporations, keepers of inns, and managers of places of public
amusement are agents or instrumentalities of the State, because they are charged
with duties to the public, and are amenable, in respect of their duties and functions,
to governmental regulation. It seems to me that, within the principle settled
in Ex parte Virginia, a denial, by these instrumentalities of the State, to the
citizen, because of his race, of that equality of civil rights secured to him
by law, is a denial by the State, within the meaning of the Fourteenth Amendment.
If it be not, then that race is left, in respect of the civil rights in question,
practically at the mercy of corporations and individuals wielding power under
the States.”
The nexus between the State and the private enterprise may be control, as in the
case of a state agency. Pennsylvania v. Board of Trusts, 353 U. S. 230. Or the
nexus may be one of numerous other devices. “State support of segregated schools
through any arrangement, management, funds, or property cannot be squared” with
the Equal Protection Clause. Cooper v. Aaron, 358 U. S. 1, 19. Cf. Hampton v.
Jacksonville, 304 F. 2d 320. A state-assisted enterprise serving the public does
not escape its constitutional duty to serve all customers irrespective of race,
even though its actual operation is in the hands of a lessee. Burton v. Wilmington
Parking Authority, 365 U. S. 715. Cf. Boynton v. Virginia, 364 U. S. 454. State
licensing and surveillance.of a business serving the public also brings its service
into the public domain. This restaurant needs a permit from Louisiana to operate;
and during the existence of the license the State has broad powers of visitation
and control. This restaurant is thus an instrumentality of the State since the
State charges it with duties to the public and supervises its performance. The
State''s interest in and activity with regard to its restaurants extends far beyond
any mere income-producing licensing requirement.'
- 'Among other things, courts at this second step have sometimes considered whether
an employee’s speech interests are outweighed by “ ‘the interest of the State,
as an employer, in promoting the efficiency of the public services it performs
through its employees.’ ” Id., at 417, 126 S.Ct. 1951 *2424 (quoting Pickering,
391 U.S. at 568, 88 S.Ct. 1731).
Both sides ask us to employ at least certain aspects of this Pickering–Garcetti framework
to resolve Mr. Kennedy’s free speech claim. They share additional common ground
too. They agree that Mr. Kennedy’s speech implicates a matter of public concern.
See App. to Pet. for Cert. 183; Brief for Respondent 44. They also appear to accept,
at least for argument’s sake, that Mr. Kennedy’s speech does not raise questions
of academic freedom that may or may not involve “additional” First Amendment “interests”
beyond those captured by this framework. Garcetti, 547 U.S. at 425, 126 S.Ct.
1951; see also Keyishian v. Board of Regents of Univ. of State of N. Y., 385 U.S.
589, 603, 87 S.Ct. 675, 17 L.Ed.2d 629 (1967); Brief for Petitioner 26, n. 2.
At the first step of the Pickering–Garcetti inquiry, the parties’ disagreement
thus turns out to center on one question alone: Did Mr. Kennedy offer his prayers
in his capacity as a private citizen, or did they amount to government speech
attributable to the District?
Our cases offer some helpful guidance for resolving this question. In Garcetti,
the Court concluded that a prosecutor’s internal memorandum to a supervisor was
made “pursuant to [his] official duties,” and thus ineligible for First Amendment
protection. 547 U.S. at 421, 126 S.Ct. 1951. In reaching this conclusion, the
Court relied on the fact that the prosecutor’s speech “fulfill[ed] a responsibility
to advise his supervisor about how best to proceed with a pending case.” Ibid.
In other words, the prosecutor’s memorandum was government speech because it was
speech the government “itself ha[d] commissioned or created” and speech the employee
was expected to deliver in the course of carrying out his job. Id., at 422, 126
S.Ct. 1951.
By contrast, in Lane a public employer sought to terminate an employee after he
testified at a criminal trial about matters involving his government employment.
573 U.S. at 233, 134 S.Ct. 2369. The Court held that the employee’s speech was
protected by the First Amendment. Id., at 231, 134 S.Ct. 2369. In doing so, the
Court held that the fact the speech touched on matters related to public employment
was not enough to render it government speech. Id., at 239–240, 134 S.Ct. 2369.
Instead, the Court explained, the “critical question ... is whether the speech
at issue is itself ordinarily within the scope of an employee’s duties.” Id.,
at 240, 134 S.Ct. 2369. It is an inquiry this Court has said should be undertaken
“practical[ly],” rather than with a blinkered focus on the terms of some formal
and capacious written job description. Garcetti, 547 U.S. at 424, 126 S.Ct. 1951.
To proceed otherwise would be to allow public employers to use “excessively broad
job descriptions” to subvert the Constitution’s protections. Ibid.
Applying these lessons here, it seems clear to us that Mr. Kennedy has demonstrated
that his speech was private speech, not government speech. When Mr. Kennedy uttered
the three prayers that resulted in his suspension, he was not engaged in speech
“ordinarily within the scope” of his duties as a coach. Lane, 573 U.S. at 240,
134 S.Ct. 2369. He did not speak pursuant to government policy. He was not seeking
to convey a government-created message. He was not instructing players, discussing
strategy, encouraging better on-field performance, or engaged in any other speech
the District paid him to produce as a coach. See Part I–B, supra. Simply put:
Mr. Kennedy’s prayers did not “ow[e their] existence” to Mr. Kennedy’s responsibilities
as a public employee.'
- source_sentence: Discuss the implications of the Thirteenth Amendment as it relates
to Congress's power to enact laws against private racial discrimination in property
transactions. How does the text support the assertion that Congress's authority
extends beyond state action?
sentences:
- '––––, ––––, 142 S.Ct. 1539, 1545, ––– L.Ed.2d –––– (2022) (THOMAS, J., concurring)
(internal quotation*2301 marks omitted). Either way, the Due Process Clause at
most guarantees process. It does not, as the Court’s substantive due process cases
suppose, “forbi[d] the government to infringe certain ‘fundamental’ liberty interests
at all, no matter what process is provided.” Reno v. Flores, 507 U.S. 292, 302,
113 S.Ct. 1439, 123 L.Ed.2d 1 (1993); see also, e.g., Collins v. Harker Heights,
503 U.S. 115, 125, 112 S.Ct. 1061, 117 L.Ed.2d 261 (1992).
As I have previously explained, “substantive due process” is an oxymoron that
“lack[s] any basis in the Constitution.” Johnson, 576 U.S. at 607–608, 135 S.Ct.
2551 (opinion of THOMAS, J.); see also, e.g., Vaello Madero, 596 U.S., at ––––,
142 S.Ct., at 1545 (THOMAS, J., concurring) (“[T]ext and history provide little
support for modern substantive due process doctrine”). “The notion that a constitutional
provision that guarantees only ‘process’ before a person is deprived of life,
liberty, or property could define the substance of those rights strains credulity
for even the most casual user of words.” McDonald v. Chicago, 561 U.S. 742, 811,
130 S.Ct. 3020, 177 L.Ed.2d 894 (2010) (THOMAS, J., concurring in part and concurring
in judgment); see also United States v. Carlton, 512 U.S. 26, 40, 114 S.Ct. 2018,
129 L.Ed.2d 22 (1994) (Scalia, J., concurring in judgment). The resolution of
this case is thus straightforward. Because the Due Process Clause does not secure
any substantive rights, it does not secure a right to abortion.
The Court today declines to disturb substantive due process jurisprudence generally
or the doctrine’s application in other, specific contexts. Cases like Griswold
v. Connecticut, 381 U.S. 479, 85 S.Ct. 1678, 14 L.Ed.2d 510 (1965) (right of married
persons to obtain contraceptives)*; Lawrence v. Texas, 539 U.S. 558, 123 S.Ct.
2472, 156 L.Ed.2d 508 (2003) (right to engage in private, consensual sexual acts);
and Obergefell v. Hodges, 576 U.S. 644, 135 S.Ct. 2584, 192 L.Ed.2d 609 (2015)
(right to same-sex marriage), are not at issue. The Court’s abortion cases are
unique, see ante, at 2257 – 2258, 2277 – 2278, 2280 – 2281, and no party has asked
us to decide “whether our entire Fourteenth Amendment jurisprudence must be preserved
or revised,” McDonald, 561 U.S. at 813, 130 S.Ct. 3020 (opinion of THOMAS, J.).
Thus, I agree that “[n]othing in [the Court’s] opinion should be understood to
cast doubt on precedents that do not concern abortion.” Ante, at 2277 – 2278.
For that reason, in future cases, we should reconsider all of this Court’s substantive
due process precedents, including Griswold, Lawrence, and Obergefell. Because
any substantive due process decision is “demonstrably erroneous,” Ramos v. Louisiana,
590 U.S. ––––, ––––, 140 S.Ct. 1390, 1424, 206 L.Ed.2d 583 (2020) (THOMAS, J.,
concurring in judgment), we have a duty to “correct the error” established in
those precedents, Gamble v. United States, 587 U.S. ––––, ––––, 139 S.Ct. 1960,
1984-1985, 204 L.Ed.2d 322 (2019) (THOMAS, J., concurring).'
- 'On October 21, the superintendent further observed to a state official that “[t]he
issue is quickly changing as it has shifted from leading prayer with student athletes,
to a coaches [sic] right to conduct” his own prayer “on the 50 yard line.” Id.,
at 88.
On October 23, shortly before that evening’s game, the District wrote Mr. Kennedy
again. It expressed “appreciation” for his “efforts to comply” with the District’s
directives, including avoiding “on-the-job prayer with players in the ... football
program, both in the locker room prior to games as well as on the field immediately
following games.” Id., at 90. The letter also admitted that, during Mr. Kennedy’s
recent October 16 postgame prayer, his students were otherwise engaged and not
praying with him, and that his prayer was “fleeting.” Id., at 90, 93. Still, the
District explained that a “reasonable observer” could think government endorsement
of religion had occurred when a “District employee, on the field only by virtue
of his employment with the District, still on duty” engaged in “overtly religious
conduct.” Id., at 91, 93. The District thus made clear that the only option it
would offer Mr. Kennedy was to allow him to pray after a game in a “private location”
behind closed doors and “not observable to students or the public.” Id., at 93–94.
After the October 23 game ended, Mr. Kennedy knelt at the 50-yard line, where
“no one joined him,” and bowed his head for a “brief, quiet prayer.” 991 F.3d
at 1019; App. 173, 236–239. The superintendent informed the District’s board that
this prayer “moved closer to what we want,” but nevertheless remained “unconstitutional.”
Id., at 96. After the final relevant football game on October 26, Mr. Kennedy
again knelt alone to offer a brief prayer as the players engaged in postgame traditions.
443 F.Supp.3d 1223, 1231 (W.D. Wash. 2020); App. to Pet. for Cert. 182. While
he was praying, other adults gathered around him on the field. See 443 F.Supp.3d
at 1231; App. 97. Later, Mr. Kennedy rejoined his players for a postgame talk,
after they had finished singing the school fight song. 443 F.Supp.3d at 1231;
App. 103.
C
Shortly after the October 26 game, the District placed Mr. Kennedy on paid administrative
*2419 leave and prohibited him from “participat[ing], in any capacity, in ...
football program activities.” Ibid. In a letter explaining the reasons for this
disciplinary action, the superintendent criticized Mr. Kennedy for engaging in
“public and demonstrative religious conduct while still on duty as an assistant
coach” by offering a prayer following the games on October 16, 23, and 26. Id.,
at 102. The letter did not allege that Mr. Kennedy performed these prayers with
students, and it acknowledged that his prayers took place while students were
engaged in unrelated postgame activities. Id., at 103. Additionally, the letter
faulted Mr. Kennedy for not being willing to pray behind closed doors. Id., at
102.
In an October 28 Q&A document provided to the public, the District admitted that
it possessed “no evidence that students have been directly coerced to pray with
Kennedy.” Id., at 105. The Q&A also acknowledged that Mr. Kennedy “ha[d] complied”
with the District’s instruction to refrain from his “prior practices of leading
players in a pre-game prayer in the locker room or leading players in a post-game
prayer immediately following games.” Ibid. But the Q&A asserted that the District
could not allow Mr. Kennedy to “engage in a public religious display.” Id., at
105, 107, 110. Otherwise, the District would “violat[e] the ... Establishment
Clause” because “reasonable ... students and attendees” might perceive the “district
[as] endors[ing] ... religion.” Id., at 105.
While Mr. Kennedy received “uniformly positive evaluations” every other year of
his coaching career, after the 2015 season ended in November, the District gave
him a poor performance evaluation. Kennedy v. Bremerton School Dist., 869 F.3d
813, 820 (C.A.9 2017).'
- 'Nor was the scope of the 1866 Act altered when it was re-enacted in 1870, some
two years after the ratification of the Fourteenth Amendment.71 It is quite true
that some members of Congress supported the Fourteenth Amendment “in order to
eliminate doubt as to the constitutional validity of the Civil Rights Act as applied
to the States.” Hurd v. Hodge, 334 U.S. 24, 32—33, 68 S.Ct. 847, 852. But it certainly
does not follow that the adoption of the Fourteenth Amendment or the subsequent
readoption of the Civil Rights Act were meant somehow to limit its application
to state action. The legislative history furnishes not the slightest factual basis
for any such speculation, and the conditions prevailing in 1870 make it highly
implausible. For by that time most, if not all, of the former Confederate States,
then under the control of “reconstructed” legislatures, had formally repudiated
racial discrimination, and the focus of congressional concern had clearly shifted
from hostile statutes to the activities of groups like the Ku Klux Klan, operating
wholly outside the law.72
**2202 *437 Against this background, it would obviously make no sense to assume,
without any historical support whatever, that Congress made a silent decision
in 1870 to exempt private discrimination from the operation of the Civil Rights
Act of 1866.73 “The cardinal rule is that repeals by implication are not favored.”
Posadas v. National City Bank, 296 U.S. 497, 503, 56 S.Ct. 349, 352, 80 L.Ed.
351. All Congress said in 1870 was that the 1866 law “is hereby re-enacted.” That
is all Congress meant.
As we said in a somewhat different setting two Terms ago, “We think that history
leaves no doubt that, if we are to give (the law) the scope that its origins dictate,
we must accord it a sweep as broad as its language.” United States v. Price, 383
U.S. 787, 801, 86 S.Ct. 1152, 1160. “We are not at liberty to seek ingenious analytical
instruments,” ibid., to carve from s 1982 an exception for private conduct—even
though its application to such conduct in the present context is without established
precedent. And, as the Attorney General of the United States said at the oral
argument of this case, “The fact that the statute lay partially dormant for many
years cannot be held to diminish its force today.”
V.
The remaining question is whether Congress has power under the Constitution to
do what s 1982 purports to do: to prohibit all racial discrimination, private
and public, in the sale and rental of property. Our starting point is the Thirteenth
Amendment, for it was pursuant *438 to that constitutional provision that Congress
originally enacted what is now s 1982. The Amendment consists of two parts. Section
1 states:
“Neither slavery nor involuntary servitude, except as a punishment for crime whereby
the party shall have been duly convicted, shall exist within the United States,
or any place subject to their jurisdiction.”
Section 2 provides:
“Congress shall have power to enforce this article by appropriate legislation.”
As its text reveals, the Thirteenth Amendment “is not a mere prohibition of state
laws establishing or upholding slavery, but an absolute declaration that slavery
or involuntary servitude shall not exist in any part of the United States.” Civil
Rights Cases, 109 U.S. 3, 20, 3 S.Ct. 18, 28, 27 L.Ed. 835. It has never been
doubted, therefore, “that the power vested in Congress to enforce the article
by appropriate legislation,” ibid., includes the power to enact laws “direct and
primary, operating upon the acts of individuals, whether sanctioned by state legislation
or not.” Id., at 23, 3 S.Ct., at 30.74
Thus, the fact that s 1982 operates upon the unofficial acts of private individuals,
whether or not sanctioned by state law, presents no constitutional problem. If
Congress has power **2203 under the Thirteenth Amendment to eradicate conditions
that prevent Negroes from buying and renting property because of their race or
color, then no federal statute calculated to achieve that objective *439 can be
thought to exceed the constitutional power of Congress simply because it reaches
beyond state action to regulate the conduct of private individuals. The constitutional
question in this case, therefore, comes to this: Does the authority of Congress
to enforce the Thirteenth Amendment “by appropriate legislation” include the power
to eliminate all racial barriers to the acquisition of real and personal property?
We think the answer to that question is plainly yes.'
- source_sentence: According to the statute referenced in the context, what is the
standard for establishing the requisite injury necessary for obtaining an injunction
under 17 U.S.C. § 1203(b)(1)?
sentences:
- 'Post-Trial Mem. at 27-28.
[263] The statute expressly authorizes injunctions to prevent or restrain violations,
17 U.S.C. § 1203(b)(1), thus demonstrating that the requisite injury need only
be threatened.
[264] Def. Post-Trial Mem. at 28.
[265] Id. at 28-29.
[266] See, e.g., Ex. AYZ (Hunt Dep.) at 94-104.
[267] Id. 30.
[268] Ex. 113.
[269] Defendants'' argument would lack merit even if there were credible proof
that other circumvention devices actually exist and produce results comparable
to DeCSS. The available movies must have been decrypted with DeCSS or something
else. As far as this record discloses, any such device or technology would violate
the DMCA for the same reasons as does DeCSS. In consequence, this case comes within
the principle of Summers v. Tice, 33 Cal.2d 80, 199 P.2d 1 (1948). Where, as here,
two or more persons take substantially identical wrongful actions, one and only
one of which had to be the source of the plaintiffs'' injury, and it is equally
likely that one inflicted the injury as the other, the burden of proof on causation
shifts to the defendants, each of which is liable absent proof that its action
did not cause the injury. See 4 Fowler V. Harper & Fleming James, Jr., THE LAW
OF TORTS §§ 101-04 (2d ed.1996).
Defendants'' efforts to avoid the consequences of this common sense principle
are unpersuasive. They argue, for example, that plaintiffs may not invoke the
theory unless they join as defendants everyone who may have contributed to the
injury. Def. Post-Trial Mem. at 32 n. 18 (citing Ex. UZ). It would be difficult
to imagine a more nonsensical requirement in the context of this case. Where,
as here, harm is done by dissemination of information over the Internet, probably
by a substantial number of people all over the world, defendants'' proposed rule
would foreclose judicial relief anywhere because joinder of all plainly would
be impossible in any one place, and technology does not permit identification
of which wrongdoer''s posting or product led to which pirated copy of a copyrighted
work.
[270] 17 U.S.C. § 1203(b)(1).
[271] See, e.g., S.E.C. v. Unique Financial Concepts, Inc., 196 F.3d 1195, 1199
n. 2 (11th Cir.1999) (injunction under Section 20(b) of the Securities Act of
1933, 15 U.S.C. § 77t(b), which permits an injunction "upon a proper showing,"
requires "a reasonable likelihood that the wrong will be repeated"); Commodity
Futures Trading Com''n v. Hunt, 591 F.2d 1211, 1220 (7th Cir.1979) (same under
Commodity Exchange Act, 7 U.S.C. § 13a-1(b)); S.E.C. v. Bausch & Lomb Inc., 565
F.2d 8, 18 (2d Cir.1977) (reasonable likelihood of future violations required
under § 21(d) of Securities Exchange Act of 1934, 15 U.S.C. § 78u(d), which permits
an injunction "upon a proper showing" where person "engaged or ... about to engage
in" violation of statute).
[272] See, e.g., Rondeau v. Mosinee Paper Corp., 422 U.S. 49, 57, 95 S.Ct. 2069,
45 L.Ed.2d 12 (1975) (injunctive relief in private action under § 13(d) of the
Securities Exchange Act of 1934, 15 U.S.C. § 78m(d), as added by the Williams
Act, requires a showing of irreparable harm and inadequacy of legal remedies).
[273] Tough Traveler, Ltd. v. Outbound Prods., 60 F.3d 964, 967-68 (2d Cir.1995)
(trademark); Fisher-Price, Inc. v. Well-Made Toy Mfg. Corp., 25 F.3d 119, 124
(2d Cir.1994) (copyright).
[274] See, e.g., Northwestern Nat''l Ins. Co.'
- 'Indeed, were we to accept Maine’s argument, our decision in Espinoza would be
rendered essentially meaningless. By Maine’s logic, Montana could have obtained
the same result that we held violated the First Amendment simply by redefining
its tax credit for sponsors of generally available scholarships as limited to
“tuition payments for the rough equivalent of a Montana public education”—meaning
a secular education. But our holding in Espinoza turned on the substance of free
exercise protections, not on the presence or absence of magic words. That holding
applies fully whether the prohibited discrimination is in an express provision
like § 2951(2) or in a party’s reconceptualization of the public benefit.
Maine may provide a strictly secular education in its public schools. But BCS
and Temple Academy—like numerous other recipients of Maine tuition assistance
payments—are not public schools. In order to provide an education to children
who live in certain parts of its far-flung State, Maine has decided not to operate
schools of its own, but instead to offer tuition assistance that parents may direct
to the public or private schools of their choice. Maine’s administration of that
benefit is subject to the free exercise principles governing any such public benefit
program—including the prohibition on denying the benefit based on a recipient’s
religious exercise.
The dissents are wrong to say that under our decision today Maine “must” fund
religious education. Post, at 2006 (BREYER, J., dissenting). Maine chose to allow
some parents to direct state tuition payments to private schools; that decision
was not “forced upon” it. Post, at 2014 (SOTOMAYOR, J., dissenting). The State
retains a number of options: it could expand the reach of its public school system,
increase the availability of transportation, provide some combination of tutoring,
remote learning, and partial attendance, or even operate boarding schools of its
own. As we held in Espinoza, a “State need not subsidize private education. But
once a State decides to do so, it cannot disqualify some private schools solely
because they are religious.” 591 U. S., at ––––, 140 S.Ct., at 2261.
B
The Court of Appeals also attempted to distinguish this case from Trinity Lutheran
and Espinoza on the ground that the funding restrictions in those cases were “solely
status-based religious discrimination,” while the challenged provision here “imposes
a use-based restriction.” 979 F.3d at 35, 37–38...
In Trinity Lutheran, the Missouri Constitution banned the use of public funds
in aid of “any church, sect or denomination of religion.” [citation omitted].
We noted that the case involved “express discrimination based on religious identity,”
which was sufficient unto the day in deciding it, and that our opinion did “not
address religious uses of funding.” [citation omitted]
So too in Espinoza, the discrimination at issue was described by the Montana Supreme
Court as a prohibition on aiding “schools controlled by churches,” and we *2001
analyzed the issue in terms of “religious status and not religious use.” [citation
omitted] Foreshadowing Maine’s argument here, Montana argued that its case was
different from Trinity Lutheran’s because it involved not playground resurfacing,
but general funds that “could be used for religious ends by some recipients, particularly
schools that believe faith should ‘permeate[ ]’ everything they do.” [citation
omitted] We explained, however, that the strict scrutiny triggered by status-based
discrimination could not be avoided by arguing that “one of its goals or effects
[was] preventing religious organizations from putting aid to religious uses.”
[citation omitted] And we noted that nothing in our analysis was “meant to suggest
that we agree[d] with [Montana] that some lesser degree of scrutiny applies to
discrimination against religious uses of government aid.” [citation omitted]
Maine’s argument, however—along with the decision below and Justice BREYER’s dissent—is
premised on precisely such a distinction. [citations omitted]
That premise, however, misreads our precedents. In Trinity Lutheran and Espinoza,
we held that the Free Exercise Clause forbids discrimination on the basis of religious
status. But those decisions never suggested that use-based discrimination is any
less offensive to the Free Exercise Clause. This case illustrates why.'
- '429
Supreme Court of the United States.
SAMUEL M. CLYATT
v.
UNITED STATES.
No. 235.
|
Argued December 13, 14, 1904.
|
Decided March 13, 1905.
Synopsis
ON WRIT of Certiorari to the United States Circuit Court of Appeals for the Fifth
Circuit, bringing up for review a judgment of the Circuit Court for the Northern
District of Florida, convicting defendant of returning certain specified persons
to a condition of peonage, which judgment had been taken to the Circuit Court
of Appeals by a writ of error to the Circuit Court. Reversed and the cause remanded
for a new trial.
**429 Statement by Mr. Justice Brewer:
Considers the constitutionality of Sections 1990 and 5526, Rev. Stat. (U. S. Comp.
Stat. 1901, pp. 1266, 3715), [Anti-Peonage Act]
*215 Mr. Justice Brewer delivered the opinion of the court:
…What is peonage? It may be defined as a status or condition of compulsory service,
based upon the indebtedness of the peon to the master. The basal fact is indebtedness.
As said by Judge Benedict, delivering the opinion in Jaremillo v. Romero, 1 N.
M. 190, 194: ‘One fact existed universally: all were indebted to their masters.
This was the cord by which they seemed bound to their master’s service.’ Upon
this is based a condition of compulsory service. Peonage is sometimes classified
as voluntary or involuntary; but this implies simply a difference in the mode
of origin, but none in the character of the servitude. The one exists where the
debtor voluntarily contracts to enter the service of his creditor. The other is
forced upon the debtor by some provision of law. But peonage, however created,
is compulsory service,—involuntary servitude. The peon can release himself therefrom,
it is true, by the payment of the debt, but otherwise the service is enforced.
A clear distinction exists between peonage and the voluntary performance of labor
or rendering of services in payment of a debt. In the latter case the debtor,
though contracting to pay his indebtedness by labor or service, and subject, like
any other contractor, to an action for damages for breach of that contract, can
elect at any time to break it, and no law or force compels *216 performance or
a continuance of the service. We need not stop to consider any possible limits
or exceptional cases, such as the service of a sailor…or the obligations of a
child to its parents, or of an apprentice to his master, or the power of the legislature
to make unlawful, and punish criminally, an abandonment by an employee of his
post of labor in any extreme cases. That which is contemplated by the statute
is compulsory service to secure the payment of a debt. Is this legislation within
the power of Congress? It may be conceded, as a general proposition, that the
ordinary relations of individual to individual are subject to the control of the
states, and are not intrusted to the general government; but the 13th Amendment,
adopted as an outcome of the Civil War, reads:
‘Sec. 1. Neither slavery nor involuntary servitude, except as a punishment for
crime whereof the party shall have been duly convicted, shall exist within the
United States, or any place subject to their jurisdiction.
‘Sec. 2. Congress shall have power to enforce this article by appropriate legislation.’
This amendment denounces a status or condition, irrespective of the manner or
authority by which it is created. The prohibitions of the 14th and 15th Amendments
are largely upon the acts of the states; but the 13th Amendment names no party
or authority, but simply forbids slavery and involuntary servitude, grants to
Congress power to enforce this prohibition by appropriate legislation. The differences
between the 13th and subsequent amendments [can be described as follows:]
This amendment, as well as the 14th, is undoubtedly self-executing without any
ancillary legislation, so far as its terms are applicable to any existing state
of circumstances. By its own unaided force and effect it abolished slavery, and
*217 established universal freedom. Still, legislation may be necessary and proper
to meet all the various cases and circumstances to be affected by it, and to prescribe
proper modes of redress for its violation in letter or spirit. And such legislation
may be primary and direct in its character; for the amendment is not a mere prohibition
of state laws establishing or upholding slavery, but an absolute declaration that
slavery or involuntary servitude shall not exist in any part of the United States.
. . .'
- source_sentence: How does the standard for applying the Second Amendment, as outlined
in the context, compare to the protection of other constitutional rights, such
as the freedom of speech in the First Amendment?
sentences:
- 'Eventually, HCC moved to dismiss the complaint. The District Court granted the
motion, concluding that Mr. Wilson lacked standing under Article III. On appeal,
a panel of the Fifth Circuit reversed, holding that Mr. Wilson had standing and
that his complaint stated a viable First Amendment claim. [citation omitted]
The Fifth Circuit’s merits analysis proceeded in two steps. First, the court concluded
that a verbal “reprimand against an elected official for speech addressing a matter
of public concern is an actionable First Amendment claim under § 1983.” [citation
omitted] Next, the court reasoned that the Board’s imposition of other punishments—such
as limiting Mr. Wilson’s eligibility for officer positions and his access to certain
funds—did “not violate his First Amendment rights” because Mr. Wilson did not
have an “entitlement” to those privileges. [citation omitted] In sum, the court
held that Mr. Wilson’s § 1983 action could proceed, but only as to the Board’s
unadorned censure resolution. HCC’s request for rehearing en banc failed by an
equally divided vote. [citation omitted].
In time, HCC filed a petition for certiorari in this Court. It asked us to review
the Fifth Circuit’s judgment that Mr. Wilson may pursue a First Amendment claim
based on a purely verbal censure. Last year, we agreed to take up that question.
[citation omitted] But as merits briefing unfolded, Mr. Wilson did not just seek
to defend the Fifth Circuit’s judgment; he also sought to challenge it in part.
Specifically, he argued that the Fifth Circuit erred to the extent that it upheld
the Board’s nonverbal punishments as consistent with the First Amendment. Generally,
however, when a respondent in this Court seeks to alter a lower court’s judgment,
he must file and we must grant a cross-petition for review. [citation omitted]
Mr. Wilson filed no such petition in this case. As a result, we decline to take
up his *1259 challenge to the Fifth Circuit’s judgment, and the only question
before us remains the narrow one on which we granted certiorari: Does Mr. Wilson
possess an actionable First Amendment claim arising from the Board’s purely verbal
censure?
II
A
The First Amendment prohibits laws “abridging the freedom of speech.” One obvious
implication of that rule is that the government usually may not impose prior restraints
on speech. [citation omitted] But other implications follow too. Relevant here,
no one before us questions that, “[a]s a general matter,” the First Amendment
prohibits government officials from subjecting individuals to “retaliatory actions”
after the fact for having engaged in protected speech. [citations omitted] Mr.
Wilson argues that the Board’s censure resolution represents exactly that kind
of impermissible retaliatory action.
Almost immediately, however, this submission confronts a challenge. When faced
with a dispute about the Constitution’s meaning or application, “[l]ong settled
and established practice is a consideration of great weight.” [citation omitted]
Often, “a regular course of practice” can illuminate or “liquidate” our founding
document’s “terms & phrases.” [citations omitted] That principle poses a problem
for Mr. Wilson because elected bodies in this country have long exercised the
power to censure their members. In fact, no one before us has cited any evidence
suggesting that a purely verbal censure analogous to Mr. Wilson’s has ever been
widely considered offensive to the First Amendment.
As early as colonial times, the power of assemblies in this country to censure
their members was “more or less assumed.” [citation omitted] It seems, too, that
assemblies often exercised the power to censure members for views they expressed
and actions they took “both within and without the legislature.” [citations omitted]
The parties supply little reason to think the First Amendment was designed or
commonly understood to upend this practice…
If anything, censures [of public officials] have proven more common yet at the
state and local level…According to HCC and undisputed by Mr. Wilson, it seems
elected bodies in this country issued no fewer than 20 censures in August 2020
alone. [citation omitted]
If this longstanding practice does not “put at rest” the question of the Constitution’s
meaning for the dispute before us, it surely leaves a “considerable impression.”
[citation omitted] On Mr. Wilson’s telling and under the Fifth Circuit’s holding,
a purely verbal censure by an elected assembly of one of its own members may offend
the First Amendment.'
- '[citation omitted]
We assessed the lawfulness of that handgun ban by scrutinizing whether it comported
with history and tradition. Although we noted that the ban “would fail constitutional
muster” “[u]nder any of the standards of scrutiny that we have applied to enumerated
constitutional rights,”…we did not engage in means-end scrutiny when resolving
the constitutional question. Instead, we focused on the historically unprecedented
nature of the District’s ban, observing that “[f]ew laws in the history of our
Nation have come close to [that] severe restriction.” [citation omitted] Likewise,
when one of the dissents attempted to justify the District’s prohibition with
“founding-era historical precedent,” including “various restrictive laws in the
colonial period,” we addressed each purported analogue and concluded that they
were either irrelevant or “d[id] not remotely burden the right of self-defense
as much as an absolute ban on handguns.” [citations omitted] Thus, our earlier
historical analysis sufficed to show that the Second Amendment did not countenance
a “complete prohibition” on the use of “the most popular weapon chosen by Americans
for self-defense in the home.” [citation omitted]
2
As the foregoing shows, Heller’s methodology centered on constitutional text and
*2129 history. Whether it came to defining the character of the right (individual
or militia dependent), suggesting the outer limits of the right, or assessing
the constitutionality of a particular regulation, Heller relied on text and history.
It did not invoke any means-end test such as strict or intermediate scrutiny.
Moreover, Heller and McDonald expressly rejected the application of any “judge-empowering
‘interest-balancing inquiry’ that ‘asks whether the statute burdens a protected
interest in a way or to an extent that is out of proportion to the statute’s salutary
effects upon other important governmental interests.’ ” [citations omitted] We
declined to engage in means-end scrutiny because “[t]he very enumeration of the
right takes out of the hands of government—even the Third Branch of Government—the
power to decide on a case-by-case basis whether the right is really worth insisting
upon.” [citation omitted] We then concluded: “A constitutional guarantee subject
to future judges’ assessments of its usefulness is no constitutional guarantee
at all.” [citation omitted]
Not only did Heller decline to engage in means-end scrutiny generally, but it
also specifically ruled out the intermediate-scrutiny test that respondents and
the United States now urge us to adopt. Dissenting in Heller, Justice BREYER’s
proposed standard—“ask[ing] whether [a] statute burdens a protected interest in
a way or to an extent that is out of proportion to the statute’s salutary effects
upon other important governmental interests,” …—simply expressed a classic formulation
of intermediate scrutiny in a slightly different way. [citations omitted] In
fact, Justice BREYER all but admitted that his Heller dissent advocated for intermediate
scrutiny by repeatedly invoking a quintessential intermediate-scrutiny precedent.
[citations omitted] Thus, when Heller expressly rejected that dissent’s “interest-balancing
inquiry,” [citation omitted] it necessarily rejected intermediate scrutiny.5
In sum, the Courts of Appeals’ second step is inconsistent with Heller’s historical
approach and its rejection of means-end scrutiny. We reiterate that the standard
for applying the Second Amendment is as follows: When the Second Amendment’s plain
text covers an individual’s *2130 conduct, the Constitution presumptively protects
that conduct. The government must then justify its regulation by demonstrating
that it is consistent with the Nation’s historical tradition of firearm regulation.
Only then may a court conclude that the individual’s conduct falls outside the
Second Amendment’s “unqualified command.” [citation omitted]
C
This Second Amendment standard accords with how we protect other constitutional
rights. [One example is] the freedom of speech in the First Amendment, to which
Heller repeatedly compared the right to keep and bear arms. [citation omitted]
In that context, “[w]hen the Government restricts speech, the Government bears
the burden of proving the constitutionality of its actions.” [citations omitted]
In some cases, that burden includes showing whether the expressive conduct falls
outside of the category of protected speech. [citation omitted] And to carry that
burden, the government must generally point to historical evidence about the reach
of the First Amendment’s protections.'
- 'Roe and Casey thought that one-sided view misguided. In some sense, that is the
difference in a nutshell between our precedents and the majority opinion. The
constitutional regime we have lived in for the last 50 years recognized competing
interests, and sought a balance between them. The constitutional regime we enter
today erases the woman’s interest and recognizes only the State’s (or the Federal
Government’s).
B
The majority makes this change based on a single question: Did the reproductive
right recognized in Roe and Casey exist in “1868, the year when the Fourteenth
Amendment was ratified”? Ante, at 2252 – 2253. The majority says (and with this
much we agree) that the answer to this question is no: In 1868, there was no nationwide
right to end a pregnancy, and no thought that the Fourteenth Amendment provided
one.
Of course, the majority opinion refers as well to some later and earlier history.
On the one side of 1868, it goes back as far as the 13th (the 13th!) century.
See ante, at 2249, 142 S.Ct. 2111. But that turns out to be wheel-spinning. First,
it is not clear what relevance *2324 such early history should have, even to the
majority. See New York State Rifle & Pistol Assn., Inc. v. Bruen, 597 U.S. ––––,
––––, 142 S.Ct. 2111, 2136, ––– L.Ed.2d –––– (2022) (“Historical evidence that
long predates [ratification] may not illuminate the scope of the right”). If the
early history obviously supported abortion rights, the majority would no doubt
say that only the views of the Fourteenth Amendment’s ratifiers are germane. See
ibid. (It is “better not to go too far back into antiquity,” except if olden “law
survived to become our Founders’ law”). Second—and embarrassingly for the majority—early
law in fact does provide some support for abortion rights. Common-law authorities
did not treat abortion as a crime before “quickening”—the point when the fetus
moved in the womb.2 And early American law followed the common-law rule.3 So the
criminal law of that early time might be taken as roughly consonant with Roe’s
and Casey’s different treatment of early and late abortions. Better, then, to
move forward in time. On the other side of 1868, the majority occasionally notes
that many States barred abortion up to the time of Roe. See ante, at 2253, 2260,
142 S.Ct. 2111. That is convenient for the majority, but it is window dressing.
As the same majority (plus one) just informed us, “post-ratification adoption
or acceptance of laws that are inconsistent with the original meaning of the constitutional
text obviously cannot overcome or alter that text.” New York State Rifle & Pistol
Assn., Inc., 597 U.S., at –––– – ––––, 142 S.Ct., at 2137. Had the pre-Roe liberalization
of abortion laws occurred more quickly and more widely in the 20th century, the
majority would say (once again) that only the ratifiers’ views are germane.
The majority’s core legal postulate, then, is that we in the 21st century must
read the Fourteenth Amendment just as its ratifiers did. And that is indeed what
the majority emphasizes over and over again. See ante, at 2267 (“[T]he most important
historical fact [is] how the States regulated abortion when the Fourteenth Amendment
was adopted”); see also ante, at 2242 – 2243, 2248 – 2249, and n. 24, 23, 25,
28. If the ratifiers did not understand something as central to freedom, then
neither can we. Or said more particularly: If those people did not understand
reproductive rights as part of the guarantee of liberty conferred in the Fourteenth
Amendment, then those rights do not exist.
As an initial matter, note a mistake in the just preceding sentence. We referred
there to the “people” who ratified the Fourteenth Amendment: What rights did those
“people” have in their heads at the time? But, of course, “people” did not ratify
the Fourteenth Amendment. Men did. So it is perhaps not so surprising that the
ratifiers were not perfectly attuned to the importance of reproductive rights
for women’s liberty, or for their capacity to participate as equal members of
our Nation.'
- source_sentence: Based on the court's ruling, what are the implications of Title
VII regarding discrimination against employees based on their transgender status
or failure to conform to sex stereotypes?
sentences:
- 'Thus, even if we agreed with the Funeral Home that Rost''s religious exercise
would be substantially burdened by enforcing Title VII in this case, we would
nevertheless REVERSE the district court''s grant of summary judgment to the Funeral
Home and hold instead that requiring the Funeral Home to comply with Title VII
constitutes the least restrictive means of furthering the government''s compelling
interest in eradicating discrimination against Stephens on the basis of sex. Thus,
even assuming Rost''s religious exercise is substantially burdened by the EEOC''s
enforcement action in this case, we GRANT summary judgment to the EEOC on the
Funeral Home''s RFRA defense on this alternative ground.
[ … ]
[ … ]
III. CONCLUSION
Discrimination against employees, either because of their failure to conform to
sex stereotypes or their transgender and transitioning status, is illegal under
Title VII. The unrefuted facts show that the Funeral Home fired Stephens because
she refused to abide by her employer''s stereotypical conception of her sex, and
therefore the EEOC is entitled to summary judgment as to its unlawful-termination
claim. RFRA provides the Funeral Home with no relief because continuing to employ
Stephens would not, as a matter of law, substantially burden Rost''s religious
exercise, and even if it did, the EEOC has shown that enforcing Title VII here
is the least restrictive means of furthering its compelling interest in combating
and eradicating sex discrimination. We therefore REVERSE the district court''s
grant of summary judgment in favor of the Funeral Home and GRANT summary judgment
to the EEOC on its unlawful-termination claim. We also REVERSE the district court''s
grant of summary judgment on the EEOC''s discriminatory-clothing-allowance claim,
as the district court erred in failing to consider the EEOC''s claim on the merits.
We REMAND this case to the district court for further proceedings consistent with
this opinion.
[1] We refer to Stephens using female pronouns, in accordance with the preference
she has expressed through her briefing to this court.
[2] All facts drawn from Def.''s Statement of Facts (R. 55) are undisputed. See R.
64 (Pl.''s Counter Statement of Disputed Facts) (Page ID #2066-88).
[3] See also Appellee Br. at 16 ("It is a helpful exercise to think about Price
Waterhouse and imagine that there was a dress code imposed which obligated Ms.
Hopkins to wear a skirt while her male colleagues were obliged to wear pants.
Had she simply been fired for wearing pants rather than a skirt, the case would
have ended there — both sexes would have been equally burdened by the requirement
to comply with their respective sex-specific standard. But what the firm could
not do was fire her for being aggressive or macho when it was tolerating or rewarding
the behavior among men — and when it did, it relied on a stereotype to treat her
disparately from the men in the firm.").
[4] Moreover, discrimination because of a person''s transgender, intersex, or
sexually indeterminate status is no less actionable than discrimination because
of a person''s identification with two religions, an unorthodox religion, or no
religion at all. And "religious identity" can be just as fluid, variable, and
difficult to define as "gender identity"; after all, both have "a deeply personal,
internal genesis that lacks a fixed external referent." Sue Landsittel, Strange
Bedfellows? Sex, Religion, and Transgender Identity Under Title VII, 104 NW. U.
L. REV. 1147, 1172 (2010) (advocating for "[t]he application of tests for religious
identity to the problem of gender identity [because it] produces a more realistic,
and therefore more appropriate, authentication framework than the current reliance
on medical diagnoses and conformity with the gender binary").
[5] On the other hand, there is also evidence that Stephens was fired only because
of her nonconforming appearance and behavior at work, and not because of her transgender
identity. See R. 53-6 (Rost Dep.'
- 'Such laws would furnish the readiest means of compulsion. The 13th *244 Amendment
prohibits involuntary servitude except as punishment for crime. But the exception,
allowing full latitude for the enforcement of penal laws, does not destroy the
prohibition. It does not permit slavery or involuntary servitude to be established
or maintained through the operation of the criminal law by making it a crime to
refuse to submit to the one or to render the service which would constitute the
other. The state may impose involuntary servitude as a punishment for crime, but
it may not compel one man to labor for another in payment of a debt, by punishing
him as a criminal if he does not perform the service or pay the debt.
If the statute in this case had authorized the employing company to seize the
debtor, and hold him to the service until he paid the $15, or had furnished the
equivalent in labor, its invalidity would not be questioned. It would be equally
clear that the state could not authorize its constabulary to prevent the servant
from escaping, and to force him to work out his debt. But the state could not
avail itself of the sanction of the criminal law to supply the compulsion any
more than it could use or authorize the use of physical force. ‘In contemplation
of the law, the compulsion to such service by the fear of punishment under a criminal
statute is more powerful than any guard which the employer could station.’ Ex
parte Hollman, 79 S. C. 22, 21 L.R.A.(N.S.) 249, 60 S. E. p. 24, 14 A. & E. Ann.
Cas. 1109.
**153 What the state may not do directly it may not do indirectly. If it cannot
punish the servant as a criminal for the mere failure or refusal to serve without
paying his debt, it is not permitted to accomplish the same result by creating
a statutory presumption which, upon proof of no other fact, exposes him to conviction
and punishment. Without imputing any actual motive to oppress, we must consider
the natural operation of the statute here in question (Henderson v. New York [Henderson
v. Wickham] 92 U. S. p. 268, 23 L. ed. 547), and it is apparent that it furnishes
a convenient instrument for the coercion *245 which the Constitution and the act
of Congress forbid; an instrument of compulsion peculiarly effective as against
the poor and the ignorant, its most likely victims. There is no more important
concern than to safeguard the freedom of labor upon which alone can enduring prosperity
be based. The provision designed to secure it would soon become a barren form
if it were possible to establish a statutory presumption of this sort, and to
hold over the heads of laborers the threat of punishment for crime, under the
name of fraud, but merely upon evidence of failure to work out their debts. The
act of Congress deprives of effect all legislative measures of any state through
which, directly or indirectly, the prohibited thing, to wit, compulsory service
to secure the payment of a debt, may be established or maintained; and we conclude
that § 4730, as amended, of the Code of Alabama, in so far as it makes the refusal
or failure to perform the act or service, without refunding the money or paying
for the property received, prima facie evidence of the commission of the crime
which the section defines, is in conflict with the 13th Amendment, and the legislation
authorized by that Amendment, and is therefore invalid.
In this view it is unnecessary to consider the contentions which have been made
under the 14th Amendment…
Reversed and cause remanded for further proceedings not inconsistent with this
opinion.
Mr. Justice Holmes, dissenting [omitted]
2.3
Jones v. Alfred H. Mayer Co.
88 S.Ct. 2186
Supreme Court of the United States
Joseph Lee JONES et ux., Petitioners,
v.
ALFRED H. MAYER CO. et al.
No. 645.
|
Argued April 1 and 2, 1968.
|
Decided June 17, 1968.
Synopsis
Action to recover damages and for injunctive relief because of refusal of defendants
to sell home in private subdivision to plaintiffs solely because of race. The
United States District Court for the Eastern District of Missouri, 255 F.Supp.
115, dismissed complaint, and plaintiffs appealed. The Court of Appeals for the
Eighth Circuit, 379 F.2d 33, affirmed, and certiorari was granted. The United
States Supreme Court, Mr.'
- '[citation omitted]
*1994 The program imposes no geographic limitation: Parents may direct tuition
payments to schools inside or outside the State, or even in foreign countries.
[citation omitted] In schools that qualify for the program because they are accredited,
teachers need not be certified by the State,…and Maine’s curricular requirements
do not apply…Single-sex schools are eligible. [citation omitted]
Prior to 1981, parents could also direct the tuition assistance payments to religious
schools. Indeed, in the 1979–1980 school year, over 200 Maine students opted to
attend such schools through the tuition assistance program. App. 72. In 1981,
however, Maine imposed a new requirement that any school receiving tuition assistance
payments must be “a nonsectarian school in accordance with the First Amendment
of the United States Constitution.” [citation omitted] That provision was enacted
in response to an opinion by the Maine attorney general taking the position that
public funding of private religious schools violated the Establishment Clause
of the First Amendment. We subsequently held, however, that a benefit program
under which private citizens “direct government aid to religious schools wholly
as a result of their own genuine and independent private choice” does not offend
the Establishment Clause. [citation omitted] Following our decision in Zelman,
the Maine Legislature considered a proposed bill to repeal the “nonsectarian”
requirement, but rejected it. App. 100, 108.
The “nonsectarian” requirement for participation in Maine’s tuition assistance
program remains in effect today. The Department has stated that, in administering
this requirement, it “considers a sectarian school to be one that is associated
with a particular faith or belief system and which, in addition to teaching academic
subjects, promotes the faith or belief system with which it is associated and/or
presents the material taught through the lens of this faith.” [citation omitted]
“The Department’s focus is on what the school teaches through its curriculum and
related activities, and how the material is presented.” …“[A]ffiliation or association
with a church or religious institution is one potential indicator of a sectarian
school,” but “it is not dispositive.”
B
This case concerns two families that live in SAUs that neither maintain their
own secondary schools nor contract with any nearby secondary school. App. 70,
71. Petitioners David and Amy Carson reside in Glenburn, Maine. Id., at 74. When
this litigation commenced, the Carsons’ daughter attended high school at Bangor
Christian Schools (BCS), which was founded in 1970 as a ministry of Bangor Baptist
Church. Id., at 74, 80. The Carsons sent their daughter to BCS because of the
school’s high academic standards and because the school’s Christian worldview
aligns with their sincerely held religious beliefs. Id., at 74. Given that BCS
is a “sectarian” school that cannot qualify for tuition assistance payments under
Maine’s program, id., at 80, the Carsons paid the tuition for their daughter to
attend BCS themselves, id., at 74.
Petitioners Troy and Angela Nelson live in Palermo, Maine. Id., at 78. When this
litigation commenced, the Nelsons’ daughter attended high school at Erskine Academy,
a secular private school, and their son attended middle school at Temple Academy,
a “sectarian” school affiliated with *1995 Centerpoint Community Church. Id.,
at 78, 90, 91. The Nelsons sent their son to Temple Academy because they believed
it offered him a high-quality education that aligned with their sincerely held
religious beliefs. Id., at 78. While they wished to send their daughter to Temple
Academy too, they could not afford to pay the cost of the Academy’s tuition for
both of their children. Id., at 79.
BCS and Temple Academy are both accredited by the New England Association of Schools
and Colleges (NEASC), and the Department considers each school a “private school
approved for attendance purposes” under the State’s compulsory attendance requirement.
Id., at 80, 90. Yet because neither school qualifies as “nonsectarian,” neither
is eligible to receive tuition payments under Maine’s tuition assistance program.
Id., at 80, 90. Absent the “nonsectarian” requirement, the Carsons and the Nelsons
would have asked their respective SAUs to pay the tuition to send their children
to BCS and Temple Academy, respectively. Id., at 79.
In 2018, petitioners brought suit against the commissioner of the Maine Department
of Education. Id., at 11–12.'
model-index:
- name: ModernBERT Embed base LegalTextAI Matryoshka
results:
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 768
type: dim_768
metrics:
- type: cosine_accuracy@1
value: 0.4838709677419355
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.6989247311827957
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.7956989247311828
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.9247311827956989
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.4838709677419355
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.37992831541218625
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.2838709677419354
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.17204301075268813
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.21774193548387094
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.4883512544802867
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.5882616487455197
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.7087813620071685
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.5864023588218451
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.5962578938385393
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.49158210371757605
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 512
type: dim_512
metrics:
- type: cosine_accuracy@1
value: 0.4838709677419355
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.7204301075268817
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.7849462365591398
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.9032258064516129
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.4838709677419355
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.3870967741935483
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.286021505376344
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.1677419354838709
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.22311827956989244
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.5026881720430108
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.5936379928315412
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.6944444444444444
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.5845266760205443
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.5949906127325485
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.4986982754839258
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 256
type: dim_256
metrics:
- type: cosine_accuracy@1
value: 0.45161290322580644
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.6881720430107527
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.7956989247311828
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.8817204301075269
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.45161290322580644
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.36559139784946226
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.27956989247311825
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.16559139784946234
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.20878136200716843
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.471774193548387
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.5806451612903226
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.6854838709677419
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.5650385704476973
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.5673792456050522
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.47608804104449853
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 128
type: dim_128
metrics:
- type: cosine_accuracy@1
value: 0.44086021505376344
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.6451612903225806
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.7634408602150538
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.8387096774193549
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.44086021505376344
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.3548387096774194
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.27311827956989243
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.15591397849462363
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.1872759856630824
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.44534050179211476
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.5725806451612904
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.654121863799283
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.5356361930824536
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.5453490356716165
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.45106439048323554
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 64
type: dim_64
metrics:
- type: cosine_accuracy@1
value: 0.3978494623655914
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.6021505376344086
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.7096774193548387
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.8064516129032258
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.3978494623655914
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.34050179211469533
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.26021505376344084
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.153763440860215
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.1586021505376344
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.4059139784946236
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.5259856630824372
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.6164874551971326
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.5019311887697538
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.5081626557433011
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.4181782323905875
name: Cosine Map@100
---
# ModernBERT Embed base LegalTextAI Matryoshka
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [nomic-ai/modernbert-embed-base](https://huggingface.co/nomic-ai/modernbert-embed-base) on the json dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [nomic-ai/modernbert-embed-base](https://huggingface.co/nomic-ai/modernbert-embed-base) <!-- at revision d556a88e332558790b210f7bdbe87da2fa94a8d8 -->
- **Maximum Sequence Length:** 8192 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Dataset:**
- json
- **Language:** en
- **License:** apache-2.0
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 8192, 'do_lower_case': False}) with Transformer model: ModernBertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
)
```
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("legaltextai/modernbert-embed-ft-const-legal-matryoshka")
# Run inference
sentences = [
"Based on the court's ruling, what are the implications of Title VII regarding discrimination against employees based on their transgender status or failure to conform to sex stereotypes?",
    'Thus, even if we\xa0agreed with the Funeral Home that Rost\'s religious exercise would be substantially burdened by enforcing Title VII in this case, we would nevertheless REVERSE the district court\'s grant of summary judgment to the Funeral Home and hold instead that requiring the Funeral Home to comply with Title VII constitutes the least restrictive means of furthering the government\'s compelling interest in eradicating discrimination against Stephens on the basis of sex. Thus, even assuming Rost\'s religious exercise is substantially burdened by the EEOC\'s enforcement action in this case, we GRANT summary judgment to the EEOC on the Funeral Home\'s RFRA defense on this alternative ground.\n\n\xa0\n\n[ … ]\n\n[ … ]\n\n\xa0\n\nIII. CONCLUSION\n\nDiscrimination against employees, either because of their failure to conform to sex stereotypes or their transgender and transitioning status, is illegal under Title VII. The unrefuted facts show that the Funeral Home fired Stephens because she refused to abide by her employer\'s stereotypical conception of her sex, and therefore the EEOC is entitled to summary judgment as to its unlawful-termination claim. RFRA provides the Funeral Home with no relief because continuing to employ Stephens would not, as a matter of law, substantially burden Rost\'s religious exercise, and even if it did, the EEOC has shown that enforcing Title VII here is the least restrictive means of furthering its compelling interest in combating and eradicating sex discrimination. We therefore REVERSE the district court\'s grant of summary judgment in favor of the Funeral Home and GRANT summary judgment to the EEOC on its unlawful-termination claim. We also REVERSE the district court\'s grant of summary judgment on the EEOC\'s discriminatory-clothing-allowance claim, as the district court erred in failing to consider the EEOC\'s claim on the merits. We REMAND this case to the district court for further proceedings consistent with this opinion.\n\n[1]\xa0We refer to Stephens using female pronouns, in accordance with the preference she has expressed through her briefing to this court.\n\n[2]\xa0All facts drawn from Def.\'s Statement of Facts (R. 55) are undisputed.\xa0See\xa0R. 64 (Pl.\'s Counter Statement of Disputed Facts) (Page ID #2066-88).\n\n[3]\xa0See also\xa0Appellee Br. at 16 ("It is a helpful exercise to think about\xa0Price Waterhouse\xa0and imagine that there was a dress code imposed which obligated Ms. Hopkins to wear a skirt while her male colleagues were obliged to wear pants. Had she simply been fired for wearing pants rather than a skirt, the case would have ended there — both sexes would have been equally burdened by the requirement to comply with their respective sex-specific standard. But what the firm could not do was fire her for being aggressive or macho when it was tolerating or rewarding the behavior among men — and when it did, it relied on a stereotype to treat her disparately from the men in the firm.").\n\n[4]\xa0Moreover, discrimination because of a person\'s transgender, intersex, or sexually indeterminate status is no less actionable than discrimination because of a person\'s identification with two religions, an unorthodox religion, or no religion at all. And "religious identity" can be just as fluid, variable, and difficult to define as "gender identity"; after all, both have "a deeply personal, internal genesis that lacks a fixed external referent." Sue Landsittel,\xa0Strange Bedfellows? Sex, Religion, and Transgender Identity Under Title VII,\xa0104 NW. U. L. REV. 1147, 1172 (2010) (advocating for "[t]he application of tests for religious identity to the problem of gender identity [because it] produces a more realistic, and therefore more appropriate, authentication framework than the current reliance on medical diagnoses and conformity with the gender binary").\n\n[5]\xa0On the other hand, there is also evidence that Stephens was fired only because of her nonconforming appearance and behavior at work, and not because of her transgender identity.\xa0See\xa0R. 53-6 (Rost Dep.',
    '[citation omitted]\n\n\xa0\n\n*1994 The program imposes no geographic limitation: Parents may direct tuition payments to schools inside or outside the State, or even in foreign countries. [citation omitted] In schools that qualify for the program because they are accredited, teachers need not be certified by the State,…and Maine’s curricular requirements do not apply…Single-sex schools are eligible. [citation omitted]\n\n\xa0\n\nPrior to 1981, parents could also direct the tuition assistance payments to religious schools. Indeed, in the 1979–1980 school year, over 200 Maine students opted to attend such schools through the tuition assistance program. App. 72. In 1981, however, Maine imposed a new requirement that any school receiving tuition assistance payments must be “a nonsectarian school in accordance with the First Amendment of the United States Constitution.” [citation omitted] That provision was enacted in response to an opinion by the Maine attorney general taking the position that public funding of private religious schools violated the Establishment Clause of the First Amendment. We subsequently held, however, that a benefit program under which private citizens “direct government aid to religious schools wholly as a result of their own genuine and independent private choice” does not offend the Establishment Clause. [citation omitted] Following our decision in Zelman, the Maine Legislature considered a proposed bill to repeal the “nonsectarian” requirement, but rejected it. App. 100, 108.\n\n\xa0\n\nThe “nonsectarian” requirement for participation in Maine’s tuition assistance program remains in effect today. The Department has stated that, in administering this requirement, it “considers a sectarian school to be one that is associated with a particular faith or belief system and which, in addition to teaching academic subjects, promotes the faith or belief system with which it is associated and/or presents the material taught through the lens of this faith.” [citation omitted] “The Department’s focus is on what the school teaches through its curriculum and related activities, and how the material is presented.” …“[A]ffiliation or association with a church or religious institution is one potential indicator of a sectarian school,” but “it is not dispositive.”\n\n\xa0\n\n\xa0\n\nB\n\nThis case concerns two families that live in SAUs that neither maintain their own secondary schools nor contract with any nearby secondary school. App. 70, 71. Petitioners David and Amy Carson reside in Glenburn, Maine. Id., at 74. When this litigation commenced, the Carsons’ daughter attended high school at Bangor Christian Schools (BCS), which was founded in 1970 as a ministry of Bangor Baptist Church. Id., at 74, 80. The Carsons sent their daughter to BCS because of the school’s high academic standards and because the school’s Christian worldview aligns with their sincerely held religious beliefs. Id., at 74. Given that BCS is a “sectarian” school that cannot qualify for tuition assistance payments under Maine’s program, id., at 80, the Carsons paid the tuition for their daughter to attend BCS themselves, id., at 74.\n\n\xa0\n\nPetitioners Troy and Angela Nelson live in Palermo, Maine. Id., at 78. When this litigation commenced, the Nelsons’ daughter attended high school at Erskine Academy, a secular private school, and their son attended middle school at Temple Academy, a “sectarian” school affiliated with *1995 Centerpoint Community Church. Id., at 78, 90, 91. The Nelsons sent their son to Temple Academy because they believed it offered him a high-quality education that aligned with their sincerely held religious beliefs. Id., at 78. While they wished to send their daughter to Temple Academy too, they could not afford to pay the cost of the Academy’s tuition for both of their children. Id., at 79.\n\n\xa0\n\nBCS and Temple Academy are both accredited by the New England Association of Schools and Colleges (NEASC), and the Department considers each school a “private school approved for attendance purposes” under the State’s compulsory attendance requirement. Id., at 80, 90. Yet because neither school qualifies as “nonsectarian,” neither is eligible to receive tuition payments under Maine’s tuition assistance program. Id., at 80, 90. Absent the “nonsectarian” requirement, the Carsons and the Nelsons would have asked their respective SAUs to pay the tuition to send their children to BCS and Temple Academy, respectively. Id., at 79.\n\n\xa0\n\nIn 2018, petitioners brought suit against the commissioner of the Maine Department of Education. Id., at 11–12.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
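Because the model was trained with a Matryoshka objective, its embeddings can be truncated to 512, 256, 128, or 64 dimensions with only a modest drop in retrieval quality (see the evaluation table below). The following is a minimal sketch of dimension truncation, assuming a sentence-transformers release that supports the `truncate_dim` argument (>= 2.7); the example sentences are illustrative:
```python
from sentence_transformers import SentenceTransformer

# Load the model so every embedding is truncated to its first 256 dimensions.
model = SentenceTransformer(
    "legaltextai/modernbert-embed-ft-const-legal-matryoshka",
    truncate_dim=256,
)

embeddings = model.encode([
    "What standard governs discipline for off-campus student speech?",
    "A student may demonstrate an unacceptable lack of professionalism off campus.",
])
print(embeddings.shape)
# (2, 256)

# model.similarity() computes cosine similarity, so the truncated
# vectors can be compared directly.
print(model.similarity(embeddings, embeddings))
```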
<!--
### Direct Usage (Transformers)
<details><summary>Click to see the direct usage in Transformers</summary>
</details>
-->
<!--
### Downstream Usage (Sentence Transformers)
You can finetune this model on your own dataset.
<details><summary>Click to expand</summary>
</details>
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
## Evaluation
### Metrics
#### Information Retrieval
* Datasets: `dim_768`, `dim_512`, `dim_256`, `dim_128` and `dim_64`
* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
| Metric | dim_768 | dim_512 | dim_256 | dim_128 | dim_64 |
|:--------------------|:-----------|:-----------|:----------|:-----------|:-----------|
| cosine_accuracy@1 | 0.4839 | 0.4839 | 0.4516 | 0.4409 | 0.3978 |
| cosine_accuracy@3 | 0.6989 | 0.7204 | 0.6882 | 0.6452 | 0.6022 |
| cosine_accuracy@5 | 0.7957 | 0.7849 | 0.7957 | 0.7634 | 0.7097 |
| cosine_accuracy@10 | 0.9247 | 0.9032 | 0.8817 | 0.8387 | 0.8065 |
| cosine_precision@1 | 0.4839 | 0.4839 | 0.4516 | 0.4409 | 0.3978 |
| cosine_precision@3 | 0.3799 | 0.3871 | 0.3656 | 0.3548 | 0.3405 |
| cosine_precision@5 | 0.2839 | 0.286 | 0.2796 | 0.2731 | 0.2602 |
| cosine_precision@10 | 0.172 | 0.1677 | 0.1656 | 0.1559 | 0.1538 |
| cosine_recall@1 | 0.2177 | 0.2231 | 0.2088 | 0.1873 | 0.1586 |
| cosine_recall@3 | 0.4884 | 0.5027 | 0.4718 | 0.4453 | 0.4059 |
| cosine_recall@5 | 0.5883 | 0.5936 | 0.5806 | 0.5726 | 0.526 |
| cosine_recall@10 | 0.7088 | 0.6944 | 0.6855 | 0.6541 | 0.6165 |
| **cosine_ndcg@10** | **0.5864** | **0.5845** | **0.565** | **0.5356** | **0.5019** |
| cosine_mrr@10 | 0.5963 | 0.595 | 0.5674 | 0.5453 | 0.5082 |
| cosine_map@100 | 0.4916 | 0.4987 | 0.4761 | 0.4511 | 0.4182 |
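These figures were produced by running the evaluator once per Matryoshka dimension. The sketch below shows how such a run can be reproduced; the `queries`, `corpus`, and `relevant_docs` mappings are toy placeholders rather than the actual held-out split, and the `truncate_dim` argument assumes a recent sentence-transformers release:
```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

model = SentenceTransformer("legaltextai/modernbert-embed-ft-const-legal-matryoshka")

# Toy placeholders; the real evaluation uses held-out (anchor, positive) pairs.
queries = {"q1": "What did the court hold about off-campus student speech?"}
corpus = {
    "d1": "A student may demonstrate an unacceptable lack of professionalism off campus ...",
    "d2": "The program imposes no geographic limitation on tuition payments ...",
}
relevant_docs = {"q1": {"d1"}}

evaluator = InformationRetrievalEvaluator(
    queries=queries,
    corpus=corpus,
    relevant_docs=relevant_docs,
    name="dim_256",
    truncate_dim=256,  # score retrieval using only the first 256 dimensions
)
print(evaluator(model))  # accuracy@k, precision@k, recall@k, NDCG@10, MRR@10, MAP@100
```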
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Dataset
#### json
* Dataset: json
* Size: 842 training samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 842 samples:
| | anchor | positive |
|:--------|:-----------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 24 tokens</li><li>mean: 42.46 tokens</li><li>max: 68 tokens</li></ul> | <ul><li>min: 236 tokens</li><li>mean: 962.01 tokens</li><li>max: 1056 tokens</li></ul> |
* Samples:
| anchor | positive |
|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>Based on the court's ruling, under what circumstances can a college student be held accountable for off-campus speech, and how does this relate to the standards of professionalism in a professional school setting?</code> | <code>A serious question raised by Keefe in this case is whether the First Amendment protected his unprofessional speech from academic disadvantage because it was made in- on-line, off-campus Facebook postings. On appeal, Keefe framed this contention categorically, arguing that a college student may not be punished for off-campus speech unless it is speech that is unprotected by the First Amendment, such as obscenity. We reject this categorical contention. A student may demonstrate an unacceptable lack of professionalism off campus, as well as in the classroom, and by speech as well as conduct. See Yoder v. Univ. of Louisville, 526 Fed-Appx. 537, 545-46 (6th Cir.), cert. denied, — U.S. -, 134 S.Ct. 790, 187 L.Ed.2d 594 (2013); Tatro v. Univ. of Minn., 816 N.W.2d 509, 521 (Minn. 2012). Therefore, college administrators and educators in a professional school have discretion to require compliance with recognized standards of the profession, both on and off campus, “so long as their actions are ...</code> |
| <code>Describe the two-step framework that Courts of Appeals have developed for analyzing Second Amendment challenges. What are the implications of the Supreme Court's decision to reject this framework in favor of a historical tradition-based approach?</code> | <code>Petitioners sued respondents for declaratory and injunctive relief under…42 U.S.C. § 1983, alleging that respondents violated their Second and Fourteenth Amendment rights by denying their unrestricted-license applications on the basis that they had failed to show “proper cause,” i.e., had failed to demonstrate a unique need for self-defense.<br><br> <br><br>The District Court dismissed petitioners’ complaint and the Court of Appeals affirmed. [citation omitted] Both courts relied on [a] Court of Appeals’ prior decision…which had sustained New York’s proper-cause standard, holding that the requirement was “substantially related to the achievement of an important governmental interest.” [citation omitted]<br><br> <br><br>We granted certiorari to decide whether New York’s denial of petitioners’ license applications violated the Constitution. [citation omitted]<br><br> <br><br> <br><br>II<br><br>In Heller and McDonald, we held that the Second and Fourteenth Amendments protect an individual right to keep and bear arms for self-defense. ...</code> |
| <code>Discuss the implications of the California Alien Land Law as it pertains to the rights of American citizens, specifically in the case of Fred Oyama. How does the law affect his privileges as a citizen, and what constitutional protections are being challenged?</code> | <code>269<br><br>Supreme Court of the United States<br><br>OYAMA et al.<br><br>v.<br><br>STATE OF CALIFORNIA.<br><br>No. 44.<br><br>|<br><br>Argued Oct. 22, 1947.<br><br>|<br><br>Decided Jan. 19, 1948.<br><br>Opinion<br><br>*635 Mr. Chief Justice VINSON delivered the opinion of the Court.<br><br>Petitioners challenge the constitutionality of California’s Alien Land Law1 as it has been applied in this case to effect an escheat of two small parcels of agricultural land.2 One of the petitioners is Fred Oyama, a minor American citizen in whose name title was taken. The other is his father and guardian, Kajiro Oyama, a Japanese citizen not eligible for naturalization,3 who paid the purchase price.<br><br>Petitioners press three attacks on the Alien Land Law as it has been applied in this case: first, that it deprives Fred Oyama of the equal protection of the laws and of his privileges as an American citizen; secondly, that it denies Kajiro Oyama equal protection of the laws; and, thirdly, that it contravenes the due process clause by sanctioning a taking of property after ...</code> |
* Loss: [<code>MatryoshkaLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters:
```json
{
"loss": "MultipleNegativesRankingLoss",
"matryoshka_dims": [
768,
512,
256,
128,
64
],
"matryoshka_weights": [
1,
1,
1,
1,
1
],
"n_dims_per_step": -1
}
```
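In code, this configuration corresponds to wrapping `MultipleNegativesRankingLoss` in `MatryoshkaLoss`, so the in-batch ranking objective is applied at each truncation dimension with equal weight. A minimal sketch using the public sentence-transformers API:
```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss

model = SentenceTransformer("nomic-ai/modernbert-embed-base")

# In-batch-negatives ranking loss, applied at every Matryoshka dimension.
base_loss = MultipleNegativesRankingLoss(model)
loss = MatryoshkaLoss(
    model,
    base_loss,
    matryoshka_dims=[768, 512, 256, 128, 64],  # weights default to 1 per dimension
)
```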
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: epoch
- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 16
- `gradient_accumulation_steps`: 32
- `learning_rate`: 2e-05
- `num_train_epochs`: 4
- `lr_scheduler_type`: cosine
- `warmup_ratio`: 0.1
- `bf16`: True
- `tf32`: True
- `load_best_model_at_end`: True
- `optim`: adamw_torch_fused
- `batch_sampler`: no_duplicates
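Expressed through the sentence-transformers trainer API, these settings map roughly onto the sketch below. The `output_dir` and explicit `save_strategy` are assumptions added to make the configuration runnable, not values reported above:
```python
from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="modernbert-embed-ft-const-legal-matryoshka",  # assumption
    num_train_epochs=4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=32,
    learning_rate=2e-5,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    bf16=True,
    tf32=True,
    eval_strategy="epoch",
    save_strategy="epoch",  # assumption: must match eval_strategy for load_best_model_at_end
    load_best_model_at_end=True,
    optim="adamw_torch_fused",
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)
```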
#### All Hyperparameters
<details><summary>Click to expand</summary>
- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: epoch
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 16
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 32
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 2e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 4
- `max_steps`: -1
- `lr_scheduler_type`: cosine
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: True
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: True
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: True
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch_fused
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: None
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `include_for_metrics`: []
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `average_tokens_across_devices`: False
- `prompts`: None
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional
</details>
### Training Logs
| Epoch | Step | dim_768_cosine_ndcg@10 | dim_512_cosine_ndcg@10 | dim_256_cosine_ndcg@10 | dim_128_cosine_ndcg@10 | dim_64_cosine_ndcg@10 |
|:----------:|:-----:|:----------------------:|:----------------------:|:----------------------:|:----------------------:|:---------------------:|
| 0.6038 | 1 | 0.5604 | 0.5631 | 0.5303 | 0.4907 | 0.4335 |
| 1.6038 | 2 | 0.5836 | 0.5758 | 0.5715 | 0.5180 | 0.4846 |
| 2.6038 | 3 | 0.5768 | 0.5841 | 0.5652 | 0.5296 | 0.4940 |
| **3.6038** | **4** | **0.5864** | **0.5845** | **0.565** | **0.5356** | **0.5019** |
* The bold row denotes the saved checkpoint.
### Framework Versions
- Python: 3.11.11
- Sentence Transformers: 3.4.1
- Transformers: 4.48.3
- PyTorch: 2.6.0+cu124
- Accelerate: 1.3.0
- Datasets: 3.3.0
- Tokenizers: 0.21.0
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
```
#### MatryoshkaLoss
```bibtex
@misc{kusupati2024matryoshka,
title={Matryoshka Representation Learning},
author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
year={2024},
eprint={2205.13147},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
```
#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->
"TEXT_CLASSIFICATION"
]
| [
"BEAR",
"CAS"
]
| Non_BioNLP |
# ModernBERT Embed base LegalTextAI Matryoshka
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [nomic-ai/modernbert-embed-base](https://huggingface.co/nomic-ai/modernbert-embed-base) on the json dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [nomic-ai/modernbert-embed-base](https://huggingface.co/nomic-ai/modernbert-embed-base) <!-- at revision d556a88e332558790b210f7bdbe87da2fa94a8d8 -->
- **Maximum Sequence Length:** 8192 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Dataset:**
- json
- **Language:** en
- **License:** apache-2.0
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 8192, 'do_lower_case': False}) with Transformer model: ModernBertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
)
```
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("legaltextai/modernbert-embed-ft-const-legal-matryoshka")
# Run inference
sentences = [
"Based on the court's ruling, what are the implications of Title VII regarding discrimination against employees based on their transgender status or failure to conform to sex stereotypes?",
'Thus, even if we\xa0agreed with the Funeral Home that Rost\'s religious exercise would be substantially burdened by enforcing Title VII in this case, we would nevertheless REVERSE the district court\'s grant of summary judgment to the Funeral Home and hold instead that requiring the Funeral Home to comply with Title VII constitutes the least restrictive means of furthering the government\'s compelling interest in eradicating discrimination against Stephens on the basis of sex. Thus, even assuming Rost\'s religious exercise is substantially burdened by the EEOC\'s enforcement action in this case, we GRANT summary judgment to the EEOC on the Funeral Home\'s RFRA defense on this alternative ground.\n\n\xa0\n\n[ … ]\n\n[ … ]\n\n\xa0\n\nIII. CONCLUSION\n\nDiscrimination against employees, either because of their failure to conform to sex stereotypes or their transgender and transitioning status, is illegal under Title VII. The unrefuted facts show that the Funeral Home fired Stephens because she refused to abide by her employer\'s stereotypical conception of her sex, and therefore the EEOC is entitled to summary judgment as to its unlawful-termination claim. RFRA provides the Funeral Home with no relief because continuing to employ Stephens would not, as a matter of law, substantially burden Rost\'s religious exercise, and even if it did, the EEOC has shown that enforcing Title VII here is the least restrictive means of furthering its compelling interest in combating and eradicating sex discrimination. We therefore REVERSE the district court\'s grant of summary judgment in favor of the Funeral Home and GRANT summary judgment to the EEOC on its unlawful-termination claim. We also REVERSE the district court\'s grant of summary judgment on the EEOC\'s discriminatory-clothing-allowance claim, as the district court erred in failing to consider the EEOC\'s claim on the merits. We REMAND this case to the district court for further proceedings consistent with this opinion.\n\n[1]\xa0We refer to Stephens using female pronouns, in accordance with the preference she has expressed through her briefing to this court.\n\n[2]\xa0All facts drawn from Def.\'s Statement of Facts (R. 55) are undisputed.\xa0See\xa0R. 64 (Pl.\'s Counter Statement of Disputed Facts) (Page ID #2066-88).\n\n[3]\xa0See also\xa0Appellee Br. at 16 ("It is a helpful exercise to think about\xa0Price Waterhouse\xa0and imagine that there was a dress code imposed which obligated Ms. Hopkins to wear a skirt while her male colleagues were obliged to wear pants. Had she simply been fired for wearing pants rather than a skirt, the case would have ended there — both sexes would have been equally burdened by the requirement to comply with their respective sex-specific standard. But what the firm could not do was fire her for being aggressive or macho when it was tolerating or rewarding the behavior among men — and when it did, it relied on a stereotype to treat her disparately from the men in the firm.").\n\n[4]\xa0Moreover, discrimination because of a person\'s transgender, intersex, or sexually indeterminate status is no less actionable than discrimination because of a person\'s identification with two religions, an unorthodox religion, or no religion at all. And "religious identity" can be just as fluid, variable, and difficult to define as "gender identity"; after all, both have "a deeply personal, internal genesis that lacks a fixed external referent." Sue Landsittel,\xa0Strange Bedfellows? 
Sex, Religion, and Transgender Identity Under Title VII,\xa0104 NW. U. L. REV. 1147, 1172 (2010) (advocating for "[t]he application of tests for religious identity to the problem of gender identity [because it] produces a more realistic, and therefore more appropriate, authentication framework than the current reliance on medical diagnoses and conformity with the gender binary").\n\n[5]\xa0On the other hand, there is also evidence that Stephens was fired only because of her nonconforming appearance and behavior at work, and not because of her transgender identity.\xa0See\xa0R. 53-6 (Rost Dep.',
'[citation omitted]\n\n\xa0\n\n*1994 The program imposes no geographic limitation: Parents may direct tuition payments to schools inside or outside the State, or even in foreign countries. [citation omitted] In schools that qualify for the program because they are accredited, teachers need not be certified by the State,…and Maine’s curricular requirements do not apply…Single-sex schools are eligible. [citation omitted]\n\n\xa0\n\nPrior to 1981, parents could also direct the tuition assistance payments to religious schools. Indeed, in the 1979–1980 school year, over 200 Maine students opted to attend such schools through the tuition assistance program. App. 72. In 1981, however, Maine imposed a new requirement that any school receiving tuition assistance payments must be “a nonsectarian school in accordance with the First Amendment of the United States Constitution.” [citation omitted] That provision was enacted in response to an opinion by the Maine attorney general taking the position that public funding of private religious schools violated the Establishment Clause of the First Amendment. We subsequently held, however, that a benefit program under which private citizens “direct government aid to religious schools wholly as a result of their own genuine and independent private choice” does not offend the Establishment Clause. [citation omitted] Following our decision in Zelman, the Maine Legislature considered a proposed bill to repeal the “nonsectarian” requirement, but rejected it. App. 100, 108.\n\n\xa0\n\nThe “nonsectarian” requirement for participation in Maine’s tuition assistance program remains in effect today. The Department has stated that, in administering this requirement, it “considers a sectarian school to be one that is associated with a particular faith or belief system and which, in addition to teaching academic subjects, promotes the faith or belief system with which it is associated and/or presents the material taught through the lens of this faith.” [citation omitted] “The Department’s focus is on what the school teaches through its curriculum and related activities, and how the material is presented.” …“[A]ffiliation or association with a church or religious institution is one potential indicator of a sectarian school,” but “it is not dispositive.”\n\n\xa0\n\n\xa0\n\nB\n\nThis case concerns two families that live in SAUs that neither maintain their own secondary schools nor contract with any nearby secondary school. App. 70, 71. Petitioners David and Amy Carson reside in Glenburn, Maine. Id., at 74. When this litigation commenced, the Carsons’ daughter attended high school at Bangor Christian Schools (BCS), which was founded in 1970 as a ministry of Bangor Baptist Church. Id., at 74, 80. The Carsons sent their daughter to BCS because of the school’s high academic standards and because the school’s Christian worldview aligns with their sincerely held religious beliefs. Id., at 74. Given that BCS is a “sectarian” school that cannot qualify for tuition assistance payments under Maine’s program, id., at 80, the Carsons paid the tuition for their daughter to attend BCS themselves, id., at 74.\n\n\xa0\n\nPetitioners Troy and Angela Nelson live in Palermo, Maine. Id., at 78. When this litigation commenced, the Nelsons’ daughter attended high school at Erskine Academy, a secular private school, and their son attended middle school at Temple Academy, a “sectarian” school affiliated with *1995 Centerpoint Community Church. Id., at 78, 90, 91. 
The Nelsons sent their son to Temple Academy because they believed it offered him a high-quality education that aligned with their sincerely held religious beliefs. Id., at 78. While they wished to send their daughter to Temple Academy too, they could not afford to pay the cost of the Academy’s tuition for both of their children. Id., at 79.\n\n\xa0\n\nBCS and Temple Academy are both accredited by the New England Association of Schools and Colleges (NEASC), and the Department considers each school a “private school approved for attendance purposes” under the State’s compulsory attendance requirement. Id., at 80, 90. Yet because neither school qualifies as “nonsectarian,” neither is eligible to receive tuition payments under Maine’s tuition assistance program. Id., at 80, 90. Absent the “nonsectarian” requirement, the Carsons and the Nelsons would have asked their respective SAUs to pay the tuition to send their children to BCS and Temple Academy, respectively. Id., at 79.\n\n\xa0\n\nIn 2018, petitioners brought suit against the commissioner of the Maine Department of Education. Id., at 11–12.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
<!--
### Direct Usage (Transformers)
<details><summary>Click to see the direct usage in Transformers</summary>
</details>
-->
<!--
### Downstream Usage (Sentence Transformers)
You can finetune this model on your own dataset.
<details><summary>Click to expand</summary>
</details>
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
## Evaluation
### Metrics
#### Information Retrieval
* Datasets: `dim_768`, `dim_512`, `dim_256`, `dim_128` and `dim_64`
* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
| Metric | dim_768 | dim_512 | dim_256 | dim_128 | dim_64 |
|:--------------------|:-----------|:-----------|:----------|:-----------|:-----------|
| cosine_accuracy@1 | 0.4839 | 0.4839 | 0.4516 | 0.4409 | 0.3978 |
| cosine_accuracy@3 | 0.6989 | 0.7204 | 0.6882 | 0.6452 | 0.6022 |
| cosine_accuracy@5 | 0.7957 | 0.7849 | 0.7957 | 0.7634 | 0.7097 |
| cosine_accuracy@10 | 0.9247 | 0.9032 | 0.8817 | 0.8387 | 0.8065 |
| cosine_precision@1 | 0.4839 | 0.4839 | 0.4516 | 0.4409 | 0.3978 |
| cosine_precision@3 | 0.3799 | 0.3871 | 0.3656 | 0.3548 | 0.3405 |
| cosine_precision@5 | 0.2839 | 0.286 | 0.2796 | 0.2731 | 0.2602 |
| cosine_precision@10 | 0.172 | 0.1677 | 0.1656 | 0.1559 | 0.1538 |
| cosine_recall@1 | 0.2177 | 0.2231 | 0.2088 | 0.1873 | 0.1586 |
| cosine_recall@3 | 0.4884 | 0.5027 | 0.4718 | 0.4453 | 0.4059 |
| cosine_recall@5 | 0.5883 | 0.5936 | 0.5806 | 0.5726 | 0.526 |
| cosine_recall@10 | 0.7088 | 0.6944 | 0.6855 | 0.6541 | 0.6165 |
| **cosine_ndcg@10** | **0.5864** | **0.5845** | **0.565** | **0.5356** | **0.5019** |
| cosine_mrr@10 | 0.5963 | 0.595 | 0.5674 | 0.5453 | 0.5082 |
| cosine_map@100 | 0.4916 | 0.4987 | 0.4761 | 0.4511 | 0.4182 |
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Dataset
#### json
* Dataset: json
* Size: 842 training samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 842 samples:
| | anchor | positive |
|:--------|:-----------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 24 tokens</li><li>mean: 42.46 tokens</li><li>max: 68 tokens</li></ul> | <ul><li>min: 236 tokens</li><li>mean: 962.01 tokens</li><li>max: 1056 tokens</li></ul> |
* Samples:
| anchor | positive |
|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>Based on the court's ruling, under what circumstances can a college student be held accountable for off-campus speech, and how does this relate to the standards of professionalism in a professional school setting?</code> | <code>A serious question raised by Keefe in this case is whether the First Amendment protected his unprofessional speech from academic disadvantage because it was made in- on-line, off-campus Facebook postings. On appeal, Keefe framed this contention categorically, arguing that a college student may not be punished for off-campus speech unless it is speech that is unprotected by the First Amendment, such as obscenity. We reject this categorical contention. A student may demonstrate an unacceptable lack of professionalism off campus, as well as in the classroom, and by speech as well as conduct. See Yoder v. Univ. of Louisville, 526 Fed-Appx. 537, 545-46 (6th Cir.), cert. denied, — U.S. -, 134 S.Ct. 790, 187 L.Ed.2d 594 (2013); Tatro v. Univ. of Minn., 816 N.W.2d 509, 521 (Minn. 2012). Therefore, college administrators and educators in a professional school have discretion to require compliance with recognized standards of the profession, both on and off campus, “so long as their actions are ...</code> |
| <code>Describe the two-step framework that Courts of Appeals have developed for analyzing Second Amendment challenges. What are the implications of the Supreme Court's decision to reject this framework in favor of a historical tradition-based approach?</code> | <code>Petitioners sued respondents for declaratory and injunctive relief under…42 U.S.C. § 1983, alleging that respondents violated their Second and Fourteenth Amendment rights by denying their unrestricted-license applications on the basis that they had failed to show “proper cause,” i.e., had failed to demonstrate a unique need for self-defense.<br><br> <br><br>The District Court dismissed petitioners’ complaint and the Court of Appeals affirmed. [citation omitted] Both courts relied on [a] Court of Appeals’ prior decision…which had sustained New York’s proper-cause standard, holding that the requirement was “substantially related to the achievement of an important governmental interest.” [citation omitted]<br><br> <br><br>We granted certiorari to decide whether New York’s denial of petitioners’ license applications violated the Constitution. [citation omitted]<br><br> <br><br> <br><br>II<br><br>In Heller and McDonald, we held that the Second and Fourteenth Amendments protect an individual right to keep and bear arms for self-defense. ...</code> |
| <code>Discuss the implications of the California Alien Land Law as it pertains to the rights of American citizens, specifically in the case of Fred Oyama. How does the law affect his privileges as a citizen, and what constitutional protections are being challenged?</code> | <code>269<br><br>Supreme Court of the United States<br><br>OYAMA et al.<br><br>v.<br><br>STATE OF CALIFORNIA.<br><br>No. 44.<br><br>|<br><br>Argued Oct. 22, 1947.<br><br>|<br><br>Decided Jan. 19, 1948.<br><br>Opinion<br><br>*635 Mr. Chief Justice VINSON delivered the opinion of the Court.<br><br>Petitioners challenge the constitutionality of California’s Alien Land Law1 as it has been applied in this case to effect an escheat of two small parcels of agricultural land.2 One of the petitioners is Fred Oyama, a minor American citizen in whose name title was taken. The other is his father and guardian, Kajiro Oyama, a Japanese citizen not eligible for naturalization,3 who paid the purchase price.<br><br>Petitioners press three attacks on the Alien Land Law as it has been applied in this case: first, that it deprives Fred Oyama of the equal protection of the laws and of his privileges as an American citizen; secondly, that it denies Kajiro Oyama equal protection of the laws; and, thirdly, that it contravenes the due process clause by sanctioning a taking of property after ...</code> |
* Loss: [<code>MatryoshkaLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters:
```json
{
"loss": "MultipleNegativesRankingLoss",
"matryoshka_dims": [
768,
512,
256,
128,
64
],
"matryoshka_weights": [
1,
1,
1,
1,
1
],
"n_dims_per_step": -1
}
```
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: epoch
- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 16
- `gradient_accumulation_steps`: 32
- `learning_rate`: 2e-05
- `num_train_epochs`: 4
- `lr_scheduler_type`: cosine
- `warmup_ratio`: 0.1
- `bf16`: True
- `tf32`: True
- `load_best_model_at_end`: True
- `optim`: adamw_torch_fused
- `batch_sampler`: no_duplicates
#### All Hyperparameters
<details><summary>Click to expand</summary>
- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: epoch
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 16
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 32
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 2e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 4
- `max_steps`: -1
- `lr_scheduler_type`: cosine
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: True
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: True
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: True
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch_fused
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: None
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `include_for_metrics`: []
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `average_tokens_across_devices`: False
- `prompts`: None
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional
</details>
### Training Logs
| Epoch | Step | dim_768_cosine_ndcg@10 | dim_512_cosine_ndcg@10 | dim_256_cosine_ndcg@10 | dim_128_cosine_ndcg@10 | dim_64_cosine_ndcg@10 |
|:----------:|:-----:|:----------------------:|:----------------------:|:----------------------:|:----------------------:|:---------------------:|
| 0.6038 | 1 | 0.5604 | 0.5631 | 0.5303 | 0.4907 | 0.4335 |
| 1.6038 | 2 | 0.5836 | 0.5758 | 0.5715 | 0.5180 | 0.4846 |
| 2.6038 | 3 | 0.5768 | 0.5841 | 0.5652 | 0.5296 | 0.4940 |
| **3.6038** | **4** | **0.5864** | **0.5845** | **0.565** | **0.5356** | **0.5019** |
* The bold row denotes the saved checkpoint.
### Framework Versions
- Python: 3.11.11
- Sentence Transformers: 3.4.1
- Transformers: 4.48.3
- PyTorch: 2.6.0+cu124
- Accelerate: 1.3.0
- Datasets: 3.3.0
- Tokenizers: 0.21.0
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
```
#### MatryoshkaLoss
```bibtex
@misc{kusupati2024matryoshka,
title={Matryoshka Representation Learning},
author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
year={2024},
eprint={2205.13147},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
```
#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->
"value": 0.45161290322580644, "name": "Cosine Precision@1"}, {"type": "cosine_precision@3", "value": 0.36559139784946226, "name": "Cosine Precision@3"}, {"type": "cosine_precision@5", "value": 0.27956989247311825, "name": "Cosine Precision@5"}, {"type": "cosine_precision@10", "value": 0.16559139784946234, "name": "Cosine Precision@10"}, {"type": "cosine_recall@1", "value": 0.20878136200716843, "name": "Cosine Recall@1"}, {"type": "cosine_recall@3", "value": 0.471774193548387, "name": "Cosine Recall@3"}, {"type": "cosine_recall@5", "value": 0.5806451612903226, "name": "Cosine Recall@5"}, {"type": "cosine_recall@10", "value": 0.6854838709677419, "name": "Cosine Recall@10"}, {"type": "cosine_ndcg@10", "value": 0.5650385704476973, "name": "Cosine Ndcg@10"}, {"type": "cosine_mrr@10", "value": 0.5673792456050522, "name": "Cosine Mrr@10"}, {"type": "cosine_map@100", "value": 0.47608804104449853, "name": "Cosine Map@100"}]}, {"task": {"type": "information-retrieval", "name": "Information Retrieval"}, "dataset": {"name": "dim 128", "type": "dim_128"}, "metrics": [{"type": "cosine_accuracy@1", "value": 0.44086021505376344, "name": "Cosine Accuracy@1"}, {"type": "cosine_accuracy@3", "value": 0.6451612903225806, "name": "Cosine Accuracy@3"}, {"type": "cosine_accuracy@5", "value": 0.7634408602150538, "name": "Cosine Accuracy@5"}, {"type": "cosine_accuracy@10", "value": 0.8387096774193549, "name": "Cosine Accuracy@10"}, {"type": "cosine_precision@1", "value": 0.44086021505376344, "name": "Cosine Precision@1"}, {"type": "cosine_precision@3", "value": 0.3548387096774194, "name": "Cosine Precision@3"}, {"type": "cosine_precision@5", "value": 0.27311827956989243, "name": "Cosine Precision@5"}, {"type": "cosine_precision@10", "value": 0.15591397849462363, "name": "Cosine Precision@10"}, {"type": "cosine_recall@1", "value": 0.1872759856630824, "name": "Cosine Recall@1"}, {"type": "cosine_recall@3", "value": 0.44534050179211476, "name": "Cosine Recall@3"}, {"type": "cosine_recall@5", "value": 0.5725806451612904, "name": "Cosine Recall@5"}, {"type": "cosine_recall@10", "value": 0.654121863799283, "name": "Cosine Recall@10"}, {"type": "cosine_ndcg@10", "value": 0.5356361930824536, "name": "Cosine Ndcg@10"}, {"type": "cosine_mrr@10", "value": 0.5453490356716165, "name": "Cosine Mrr@10"}, {"type": "cosine_map@100", "value": 0.45106439048323554, "name": "Cosine Map@100"}]}, {"task": {"type": "information-retrieval", "name": "Information Retrieval"}, "dataset": {"name": "dim 64", "type": "dim_64"}, "metrics": [{"type": "cosine_accuracy@1", "value": 0.3978494623655914, "name": "Cosine Accuracy@1"}, {"type": "cosine_accuracy@3", "value": 0.6021505376344086, "name": "Cosine Accuracy@3"}, {"type": "cosine_accuracy@5", "value": 0.7096774193548387, "name": "Cosine Accuracy@5"}, {"type": "cosine_accuracy@10", "value": 0.8064516129032258, "name": "Cosine Accuracy@10"}, {"type": "cosine_precision@1", "value": 0.3978494623655914, "name": "Cosine Precision@1"}, {"type": "cosine_precision@3", "value": 0.34050179211469533, "name": "Cosine Precision@3"}, {"type": "cosine_precision@5", "value": 0.26021505376344084, "name": "Cosine Precision@5"}, {"type": "cosine_precision@10", "value": 0.153763440860215, "name": "Cosine Precision@10"}, {"type": "cosine_recall@1", "value": 0.1586021505376344, "name": "Cosine Recall@1"}, {"type": "cosine_recall@3", "value": 0.4059139784946236, "name": "Cosine Recall@3"}, {"type": "cosine_recall@5", "value": 0.5259856630824372, "name": "Cosine Recall@5"}, {"type": "cosine_recall@10", "value": 
0.6164874551971326, "name": "Cosine Recall@10"}, {"type": "cosine_ndcg@10", "value": 0.5019311887697538, "name": "Cosine Ndcg@10"}, {"type": "cosine_mrr@10", "value": 0.5081626557433011, "name": "Cosine Mrr@10"}, {"type": "cosine_map@100", "value": 0.4181782323905875, "name": "Cosine Map@100"}]}]}]} |
croissantllm/base_35k | croissantllm | text2text-generation | [
"transformers",
"pytorch",
"llama",
"text-generation",
"legal",
"code",
"text-generation-inference",
"art",
"text2text-generation",
"fr",
"en",
"dataset:cerebras/SlimPajama-627B",
"dataset:uonlp/CulturaX",
"dataset:pg19",
"dataset:bigcode/starcoderdata",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
]
| 2024-01-18T13:40:48 | 2024-02-01T15:56:38 | 5 | 0 | ---
datasets:
- cerebras/SlimPajama-627B
- uonlp/CulturaX
- pg19
- bigcode/starcoderdata
language:
- fr
- en
license: mit
pipeline_tag: text2text-generation
tags:
- legal
- code
- text-generation-inference
- art
---
# CroissantLLM - Base (35k steps)
This model is part of the CroissantLLM initiative, and corresponds to the checkpoint after 35k steps (0.55T tokens).
To play with the final model, we recommend using the Chat version: https://huggingface.co/croissantllm/CroissantLLMChat-v0.1.
## Abstract
We introduce CroissantLLM, a 1.3B language model pretrained on a set of 3T English and French tokens, to bring to the research and industrial community a high-performance, fully open-sourced bilingual model that runs swiftly on consumer-grade local hardware.
To that end, we pioneer the approach of training an intrinsically bilingual model with a 1:1 English-to-French pretraining data ratio, a custom tokenizer, and bilingual finetuning datasets. We release the training dataset, notably containing a French split with manually curated, high-quality, and varied data sources.
To assess performance outside of English, we craft a novel benchmark, FrenchBench, consisting of an array of classification and generation tasks, covering various orthogonal aspects of model performance in the French Language. Additionally, rooted in transparency and to foster further Large Language Model research, we release codebases, and dozens of checkpoints across various model sizes, training data distributions, and training steps, as well as fine-tuned Chat models, and strong translation models. We evaluate our model through the FMTI framework, and validate 81% of the transparency criteria, far beyond the scores of even most open initiatives.
This work enriches the NLP landscape, breaking away from previous English-centric work in order to strengthen our understanding of multilinguality in language models.
## Citation
Our work can be cited as:
```bash
Coming soon
```
## Usage
This model is a base model; that is, it is not fine-tuned for chat and works best with few-shot prompting strategies.
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
model_name = "croissantllm/base_35k"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.float16, device_map="auto")
inputs = tokenizer("I am so tired I could sleep right now. -> Je suis si fatigué que je pourrais m'endormir maintenant.
He is heading to the market. -> Il va au marché.
We are running on the beach. ->", return_tensors="pt").to(model.device)
tokens = model.generate(**inputs, max_length=100, do_sample=True, top_p=0.95, top_k=60, temperature=0.5)
print(tokenizer.decode(tokens[0]))
# remove bos token
inputs = tokenizer("Capitales: France -> Paris, Italie -> Rome, Allemagne -> Berlin, Espagne ->", return_tensors="pt", add_special_tokens=True).to(model.device)
tokens = model.generate(**inputs, max_length=100, do_sample=True, top_p=0.95, top_k=60)
print(tokenizer.decode(tokens[0]))
```
| [
"TRANSLATION"
]
| [
"CRAFT"
]
| Non_BioNLP |
| {"datasets": ["cerebras/SlimPajama-627B", "uonlp/CulturaX", "pg19", "bigcode/starcoderdata"], "language": ["fr", "en"], "license": "mit", "pipeline_tag": "text2text-generation", "tags": ["legal", "code", "text-generation-inference", "art"]} |
serdarcaglar/roberta-base-biomedical-es | serdarcaglar | fill-mask | [
"transformers",
"pytorch",
"safetensors",
"roberta",
"fill-mask",
"es",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
]
| 2023-09-09T13:27:39 | 2023-09-19T21:09:48 | 46 | 1 | ---
language:
- es
tags:
- biomedical
- spanish
metrics:
- ppl
---
# Biomedical language model for Spanish
## Table of contents
<details>
<summary>Click to expand</summary>
- [Model description](#model-description)
- [Intended uses and limitations](#intended-use)
- [How to use](#how-to-use)
- [Limitations and bias](#limitations-and-bias)
- [Training](#training)
- [Tokenization and model pretraining](#Tokenization-pretraining)
- [Training corpora and preprocessing](#training-corpora-preprocessing)
- [Evaluation](#evaluation)
- [Additional information](#additional-information)
- [Author](#author)
- [Contact information](#contact-information)
- [Copyright](#copyright)
- [Licensing information](#licensing-information)
- [Funding](#funding)
- [Disclaimer](#disclaimer)
</details>
## Model description
Biomedical pretrained language model for Spanish.
## Intended uses and limitations
The model is ready-to-use only for masked language modelling to perform the Fill Mask task (try the inference API or read the next section). However, it is intended to be fine-tuned on downstream tasks such as Named Entity Recognition or Text Classification.
## How to use
```python
from transformers import AutoTokenizer, AutoModelForMaskedLM
tokenizer = AutoTokenizer.from_pretrained("serdarcaglar/roberta-base-biomedical-es")
model = AutoModelForMaskedLM.from_pretrained("serdarcaglar/roberta-base-biomedical-es")
from transformers import pipeline
unmasker = pipeline('fill-mask', model="serdarcaglar/roberta-base-biomedical-es")
unmasker("El único antecedente personal a reseñar era la <mask> arterial.")
```
## Training
### Tokenization and model pretraining
This model is a [RoBERTa-based](https://github.com/pytorch/fairseq/tree/master/examples/roberta) model trained on a
**biomedical** corpus in Spanish collected from several sources
- medprocner
- codiesp
- emea
- wmt19
- wmt16
- wmt22
- scielo
- ibecs
- ELRC datasets
The training corpus has been tokenized using a byte version of [Byte-Pair Encoding (BPE)](https://github.com/openai/gpt-2)
used in the original [RoBERTa](https://github.com/pytorch/fairseq/tree/master/examples/roberta) model, with a vocabulary size of 52,000 tokens. The pretraining consists of masked language model training at the subword level, following the approach employed for the RoBERTa base model with the same hyperparameters as in the original work.
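As a rough illustration of this tokenization step, a byte-level BPE tokenizer with the same vocabulary size can be trained with the Hugging Face `tokenizers` library. This is only a sketch: the corpus file path and special-token list below are illustrative assumptions, not the exact setup used for this model.
```python
import os
from tokenizers import ByteLevelBPETokenizer

# Train a byte-level BPE tokenizer matching the 52,000-token vocabulary
# described above. "corpus_es_biomedical.txt" is a placeholder path.
tokenizer = ByteLevelBPETokenizer()
tokenizer.train(
    files=["corpus_es_biomedical.txt"],
    vocab_size=52_000,
    min_frequency=2,
    special_tokens=["<s>", "<pad>", "</s>", "<unk>", "<mask>"],
)

os.makedirs("biomedical-es-tokenizer", exist_ok=True)
tokenizer.save_model("biomedical-es-tokenizer")  # writes vocab.json and merges.txt
```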
### Training corpora and preprocessing
The training corpus is composed of several biomedical corpora in Spanish, collected from publicly available corpora and crawlers.
To obtain a high-quality training corpus, a cleaning pipeline with the following operations has been applied:
- data parsing in different formats
- sentence splitting
- language detection
- filtering of ill-formed sentences
- deduplication of repetitive contents
- keep the original document boundaries
Finally, the corpora are concatenated and a further global deduplication across the corpora has been applied.
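A minimal sketch of these cleaning steps is given below. The language-detection library (`langdetect`), the naive period-based sentence splitter, and the length threshold are illustrative assumptions, not the exact tools used to build this corpus.
```python
from langdetect import detect  # assumption: langdetect as the language filter

def clean_corpus(documents):
    """Apply the cleaning steps described above to a list of raw documents."""
    seen = set()
    cleaned_docs = []
    for doc in documents:
        kept = []
        # Naive sentence splitting; a production pipeline would use a real splitter.
        for sent in (s.strip() for s in doc.split(".")):
            if len(sent.split()) < 3:     # drop ill-formed / too-short sentences
                continue
            try:
                if detect(sent) != "es":  # keep Spanish text only
                    continue
            except Exception:             # undetectable fragments are dropped
                continue
            if sent in seen:              # deduplicate repetitive content
                continue
            seen.add(sent)
            kept.append(sent)
        if kept:                          # preserve original document boundaries
            cleaned_docs.append(". ".join(kept) + ".")
    return cleaned_docs
```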
## Evaluation
The model has so far been evaluated intrinsically, reaching a perplexity of 3.09. Downstream results on the Named Entity Recognition (NER) task have not yet been reported.
Please share the results you get in the NER task using this model. I can add them here.
## Additional information
### Author
Serdar ÇAĞLAR
### Contact information
Linkedin: <https://www.linkedin.com/in/serdarildercaglar/>
For further information, send an email to <[email protected]>
### Licensing information
[Apache License, Version 2.0](https://www.apache.org/licenses/LICENSE-2.0)
### Disclaimer
<details>
<summary>Click to expand</summary>
The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and other undesirable distortions.
When third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of Artificial Intelligence.
In no event shall the owner of the models be liable for any results arising from the use made by third parties of these models.
Bu havuzda yayınlanan modeller genel bir amaca yöneliktir ve üçüncü tarafların kullanımına açıktır. Bu modellerde önyargı ve diğer istenmeyen çarpıklıklar olabilir.
Üçüncü taraflar, bu modellerden herhangi birini kullanarak (veya bu modellere dayalı sistemleri kullanarak) diğer taraflara sistem ve/veya hizmet sağladıklarında veya modellerin kullanıcısı olduklarında, bunların kullanımından kaynaklanan riskleri azaltmanın ve her durumda Yapay Zeka kullanımına ilişkin düzenlemeler de dahil olmak üzere geçerli düzenlemelere uymanın kendi sorumluluklarında olduğunu unutmamalıdırlar.
Modellerin sahibi hiçbir durumda bu modellerin üçüncü şahıslar tarafından kullanımından kaynaklanan sonuçlardan sorumlu tutulamaz.
Los modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y otras distorsiones indeseables.
Cuando terceras partes, desplieguen o proporcionen sistemas y/o servicios a otras partes utilizando cualquiera de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluida la normativa relativa al uso de Inteligencia Artificial.
En ningún caso el propietario de los modelos será responsable de los resultados derivados del uso que terceros hagan de los mismos.
</details> | [
"NAMED_ENTITY_RECOGNITION",
"TEXT_CLASSIFICATION"
]
| [
"CODIESP",
"SCIELO"
]
| BioNLP |  | {"language": ["es"]} |
sultan/BioM-ALBERT-xxlarge-PMC | sultan | fill-mask | [
"transformers",
"pytorch",
"albert",
"fill-mask",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
]
| 2022-03-02T23:29:05 | 2023-11-04T23:06:21 | 534 | 4 | ---
{}
---
# BioM-Transformers: Building Large Biomedical Language Models with BERT, ALBERT and ELECTRA
# Abstract
The impact of design choices on the performance of biomedical language models recently has been a subject for investigation. In this paper, we empirically study biomedical domain adaptation with large transformer models using different design choices. We evaluate the performance of our pretrained models against other existing biomedical language models in the literature. Our results show that we achieve state-of-the-art results on several biomedical domain tasks despite using similar or less computational cost compared to other models in the literature. Our findings highlight the significant effect of design choices on improving the performance of biomedical language models.
# Model Description
This model was pre-trained on PMC full-text articles for a further 64k steps with a batch size of 8192, initializing the weights from our BioM-ALBERT-xxlarge model. Thus, the total number of training steps for this model is 264k+64k=328k. The model is very large due to its hidden layer size (4096). To help researchers with limited resources fine-tune larger models, we created an example with PyTorch XLA. PyTorch XLA (https://github.com/pytorch/xla) is a library that allows you to use PyTorch on TPUs, which are provided for free by Google Colab and Kaggle. Follow this example to work with PyTorch/XLA [Link](https://github.com/salrowili/BioM-Transformers/blob/main/examples/Fine_Tuning_Biomedical_Models_on_Text_Classification_Task_With_HuggingFace_Transformers_and_PyTorch_XLA.ipynb). In this example, we achieve an 80.74 micro F1 score on the ChemProt task with BioM-ALBERT-xxlarge. Fine-tuning takes 43 minutes for 5 epochs.
Check our GitHub repo at https://github.com/salrowili/BioM-Transformers for TensorFlow and GluonNLP checkpoints. We also updated the repo with examples of how to fine-tune LMs on text classification and question answering tasks such as ChemProt, SQuAD, and BioASQ.
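For quick local experimentation (outside the TPU notebooks below), the checkpoint can also be loaded with the standard `transformers` fill-mask pipeline. A minimal sketch follows; the example sentence is illustrative only.
```python
from transformers import pipeline

# Load the checkpoint for masked-token prediction; ALBERT models use [MASK].
unmasker = pipeline("fill-mask", model="sultan/BioM-ALBERT-xxlarge-PMC")

for prediction in unmasker("Aspirin is used to treat [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 3))
```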
# Colab Notebook Examples
BioM-ELECTRA-LARGE on NER and ChemProt Task [![Open In Colab][COLAB]](https://colab.research.google.com/github/salrowili/BioM-Transformers/blob/main/examples/Example_of_NER_and_ChemProt_Task_on_TPU.ipynb)
BioM-ELECTRA-Large on SQuAD2.0 and BioASQ7B Factoid tasks [![Open In Colab][COLAB]](https://colab.research.google.com/github/salrowili/BioM-Transformers/blob/main/examples/Example_of_SQuAD2_0_and_BioASQ7B_tasks_with_BioM_ELECTRA_Large_on_TPU.ipynb)
BioM-ALBERT-xxlarge on SQuAD2.0 and BioASQ7B Factoid tasks [![Open In Colab][COLAB]](https://colab.research.google.com/github/salrowili/BioM-Transformers/blob/main/examples/Example_of_SQuAD2_0_and_BioASQ7B_tasks_with_BioM_ALBERT_xxlarge_on_TPU.ipynb)
Text Classification Task With HuggingFace Transformers and PyTorchXLA on Free TPU [![Open In Colab][COLAB]](https://colab.research.google.com/github/salrowili/BioM-Transformers/blob/main/examples/Fine_Tuning_Biomedical_Models_on_Text_Classification_Task_With_HuggingFace_Transformers_and_PyTorch_XLA.ipynb)
Reproducing our BLURB results with JAX [![Open In Colab][COLAB]](https://colab.research.google.com/github/salrowili/BioM-Transformers/blob/main/examples/BLURB_LeaderBoard_with_TPU_VM.ipynb)
Fine-tuning BioM-Transformers with JAX/Flax on TPUv3-8 with free Kaggle resources [![Open In Colab][COLAB]](https://www.kaggle.com/code/sultanalrowili/biom-transoformers-with-flax-on-tpu-with-kaggle)
[COLAB]: https://colab.research.google.com/assets/colab-badge.svg
# Acknowledgment
We would like to acknowledge the support of the TensorFlow Research Cloud (TFRC) team in granting us access to TPUv3 units.
# Citation
```bibtex
@inproceedings{alrowili-shanker-2021-biom,
title = "{B}io{M}-Transformers: Building Large Biomedical Language Models with {BERT}, {ALBERT} and {ELECTRA}",
author = "Alrowili, Sultan and
Shanker, Vijay",
booktitle = "Proceedings of the 20th Workshop on Biomedical Language Processing",
month = jun,
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/2021.bionlp-1.24",
pages = "221--227",
abstract = "The impact of design choices on the performance of biomedical language models recently has been a subject for investigation. In this paper, we empirically study biomedical domain adaptation with large transformer models using different design choices. We evaluate the performance of our pretrained models against other existing biomedical language models in the literature. Our results show that we achieve state-of-the-art results on several biomedical domain tasks despite using similar or less computational cost compared to other models in the literature. Our findings highlight the significant effect of design choices on improving the performance of biomedical language models.",
}
``` | [
"TEXT_CLASSIFICATION"
]
| [
"BLURB",
"CHEMPROT"
]
| BioNLP |  | {} |
clinicalnlplab/me-llama | clinicalnlplab | null | [
"transformers",
"medical",
"health",
"llama",
"llama2",
"en",
"dataset:togethercomputer/RedPajama-Data-1T",
"dataset:bigbio/med_qa",
"arxiv:2402.12749",
"license:llama2",
"endpoints_compatible",
"region:us"
]
| 2024-06-10T04:02:12 | 2024-06-10T04:14:29 | 0 | 12 | ---
datasets:
- togethercomputer/RedPajama-Data-1T
- bigbio/med_qa
language:
- en
library_name: transformers
license: llama2
tags:
- medical
- health
- llama
- llama2
---
# Me-LLaMA
## Model Overview
The Me-LLaMA model consists of two foundation models: Me-LLaMA 13B and Me-LLaMA 70B, along with their chat-enhanced counterparts, Me-LLaMA 13B-chat and Me-LLaMA 70B-chat. These models are designed for superior chat and instruction-following capabilities. The Me-LLaMA 13B and 70B were continually pretrained from the base LLaMA 2 13B and 70B models with the addition of biomedical, clinical, and general domain data. The chat versions were further instruction-tuned using comprehensive medical instruction tuning data.
## Pretraining and Data
Me-LLaMA was developed through continual pre-training and instruction tuning of LLaMA2, incorporating 129B tokens and 214K instruction tuning samples from general, biomedical, and clinical domains. The pretraining data consists of biomedical literature, clinical notes, and general domain data in a 15:1:4 ratio, sourced from:
- **Biomedical:** PubMed Central and PubMed Abstracts (Pile dataset)
- **Clinical:** De-identified free-text clinical notes from MIMIC III, MIMIC-IV, and MIMIC-CXR
- **General Domain:** Subset from the RedPajama dataset
The instruction tuning dataset includes:
- **General Domain:** Alpaca, Dolly, and ShareGPT datasets
- **Biomedical:** HealthCareMagic, Icliniq, MedInstruct, Medical Flash Cards, MEDIQA, MedicationQA, LiveQA, WikiDocPatient, Guideline QA, Pubmed Central, Pubmed, UMLS Knowledge graph
- **Clinical:** MIMIC-III and MIMIC-IV
## Evaluation
Me-LLaMA was evaluated on 12 datasets across different tasks:
- **QA:** PubMedQA, MedQA, MedMCQA, EmrQA
- **NER:** 2010 i2b2
- **Relation Extraction:** 2013 DDI
- **Classification:** HoC, MTSample
- **Text Summarization:** PubMed, MIMIC-CXR
- **NLI:** BioNLI, MedNLI
### Performance
- **Me-LLaMA 13B:** Surpassed PMC-LLaMA 13B on 11/12 datasets and LLaMA2 13B on 10/12 datasets, with competitive performance against larger models like LLaMA2 70B and Meditron 70B on 8/12 datasets.
- **Me-LLaMA 70B:** Outperformed LLaMA2 70B and Meditron 70B on 9/12 datasets.
- **Zero-shot setting:** Outperformed ChatGPT on 5/8 datasets without privacy concerns, and GPT-4 on 1/8.
- **Task-specific instruction tuning:** Surpassed ChatGPT on 7/8 and GPT-4 on 5/8 datasets.
Despite having significantly fewer parameters (13B/70B vs. 175B+ for ChatGPT and GPT-4), Me-LLaMA models demonstrated impressive performance and strong abilities in supervised and in-context learning across various medical tasks.
## Model Details
Included in this repository are four models:
1. **Me-LLaMA 13B:** Continually pretrained from LLaMA 2 13B.
2. **Me-LLaMA 70B:** Continually pretrained from LLaMA 2 70B.
3. **Me-LLaMA 13B-chat:** Further instruction-tuned from Me-LLaMA 13B using a variety of general, biomedical, and clinical datasets.
4. **Me-LLaMA 70B-chat:** Further instruction-tuned from Me-LLaMA 70B using a variety of general, biomedical, and clinical datasets.
Each model contains several files, which are standard with the transformers library:
- **config.json:** Information about the model
- **model-x-of-y.safetensors:** Model weights
- **generation_config.json:** Settings for text generation
- **special_tokens_map.json:** Special tokens used in training
- **tokenizer.json:** Mapping from indices to tokens
- **tokenizer_config.json:** Configuration file for the tokenizer
## Usage
For more details and to access the models, please visit the [Me-LLaMA repository on PhysioNet](https://physionet.org/content/me-llama/1.0.0/).
For more technical details, please visit [our paper on arXiv](https://arxiv.org/abs/2402.12749).
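Once access has been granted and the weights downloaded from PhysioNet, they can be loaded with the standard `transformers` API. The sketch below is a minimal assumption-laden example: the local path and the prompt are placeholders, not part of the official release.
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder path to the Me-LLaMA 13B-chat weights downloaded from PhysioNet.
model_path = "./me-llama/MeLLaMA-13B-chat"

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path, torch_dtype=torch.float16, device_map="auto"
)

# Illustrative prompt; chat variants may expect a specific prompt template.
prompt = "What are the common symptoms of type 2 diabetes?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```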
| [
"RELATION_EXTRACTION",
"SUMMARIZATION"
]
| [
"MEDNLI",
"MEDQA",
"PUBMEDQA"
]
| BioNLP |
| {"datasets": ["togethercomputer/RedPajama-Data-1T", "bigbio/med_qa"], "language": ["en"], "library_name": "transformers", "license": "llama2", "tags": ["medical", "health", "llama", "llama2"]} |
raghavlight/TDTE | raghavlight | null | [
"safetensors",
"mteb",
"model-index",
"region:us"
]
| 2024-06-13T00:41:23 | 2024-06-13T02:52:26 | 0 | 3 | ---
tags:
- mteb
model-index:
- name: 0523_mistralv2_sum3echo512_bbcc_8_16_16
results:
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (en)
type: mteb/amazon_counterfactual
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 79.65671641791045
- type: ap
value: 44.24063991266868
- type: f1
value: 73.91766997954294
- task:
type: Classification
dataset:
name: MTEB AmazonPolarityClassification
type: mteb/amazon_polarity
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 94.480125
- type: ap
value: 92.21829806116952
- type: f1
value: 94.47801150800291
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (en)
type: mteb/amazon_reviews_multi
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 48.157999999999994
- type: f1
value: 47.11858175135973
- task:
type: Retrieval
dataset:
name: MTEB ArguAna
type: mteb/arguana
config: default
split: test
revision: c22ab2a51041ffd869aaddef7af8d8215647e41a
metrics:
- type: map_at_1
value: 31.935000000000002
- type: map_at_10
value: 49.482
- type: map_at_100
value: 49.482
- type: map_at_1000
value: 49.482
- type: map_at_20
value: 49.482
- type: map_at_3
value: 44.464
- type: map_at_5
value: 47.569
- type: mrr_at_1
value: 33.001000000000005
- type: mrr_at_10
value: 49.989
- type: mrr_at_100
value: 49.989
- type: mrr_at_1000
value: 49.989
- type: mrr_at_20
value: 49.989
- type: mrr_at_3
value: 44.903
- type: mrr_at_5
value: 48.054
- type: ndcg_at_1
value: 31.935000000000002
- type: ndcg_at_10
value: 58.819
- type: ndcg_at_100
value: 58.819
- type: ndcg_at_1000
value: 58.819
- type: ndcg_at_20
value: 58.819
- type: ndcg_at_3
value: 48.620000000000005
- type: ndcg_at_5
value: 54.230000000000004
- type: precision_at_1
value: 31.935000000000002
- type: precision_at_10
value: 8.841000000000001
- type: precision_at_100
value: 0.8840000000000001
- type: precision_at_1000
value: 0.08800000000000001
- type: precision_at_20
value: 4.42
- type: precision_at_3
value: 20.223
- type: precision_at_5
value: 14.865
- type: recall_at_1
value: 31.935000000000002
- type: recall_at_10
value: 88.407
- type: recall_at_100
value: 88.407
- type: recall_at_1000
value: 88.407
- type: recall_at_20
value: 88.407
- type: recall_at_3
value: 60.669
- type: recall_at_5
value: 74.324
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringP2P
type: mteb/arxiv-clustering-p2p
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 48.7848435754835
- type: v_measures
value:
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringS2S
type: mteb/arxiv-clustering-s2s
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 46.10665257880071
- type: v_measures
value:
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- task:
type: Reranking
dataset:
name: MTEB AskUbuntuDupQuestions
type: mteb/askubuntudupquestions-reranking
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 66.7285956124022
- type: mrr
value: 79.72233214615486
- task:
type: STS
dataset:
name: MTEB BIOSSES
type: mteb/biosses-sts
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_pearson
value: 88.73245869702066
- type: cos_sim_spearman
value: 87.28451895745819
- type: euclidean_pearson
value: 86.44569617089661
- type: euclidean_spearman
value: 86.7236628044763
- type: manhattan_pearson
value: 86.50853979799092
- type: manhattan_spearman
value: 86.75920578302187
- task:
type: Classification
dataset:
name: MTEB Banking77Classification
type: mteb/banking77
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 88.91233766233766
- type: f1
value: 88.86315189747688
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringP2P
type: mteb/biorxiv-clustering-p2p
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 38.7850808112868
- type: v_measures
value:
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringS2S
type: mteb/biorxiv-clustering-s2s
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 37.37318034700008
- type: v_measures
value:
- 0.36845423004088185
- 0.38992061254062366
- 0.3717948730004672
- 0.36026627188254456
- 0.3669860108798917
- 0.36731355824516293
- 0.375291529012098
- 0.38550090432534534
- 0.36577228218454805
- 0.38601776258844467
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackAndroidRetrieval
type: mteb/cqadupstack-android
config: default
split: test
revision: f46a197baaae43b4f621051089b82a364682dfeb
metrics:
- type: map_at_1
value: 39.232
- type: map_at_10
value: 53.04299999999999
- type: map_at_100
value: 53.04299999999999
- type: map_at_1000
value: 53.04299999999999
- type: map_at_20
value: 53.04299999999999
- type: map_at_3
value: 48.588
- type: map_at_5
value: 51.17699999999999
- type: mrr_at_1
value: 49.356
- type: mrr_at_10
value: 59.550000000000004
- type: mrr_at_100
value: 59.550000000000004
- type: mrr_at_1000
value: 59.550000000000004
- type: mrr_at_20
value: 59.550000000000004
- type: mrr_at_3
value: 56.986000000000004
- type: mrr_at_5
value: 58.638999999999996
- type: ndcg_at_1
value: 49.356
- type: ndcg_at_10
value: 60.156
- type: ndcg_at_100
value: 59.714999999999996
- type: ndcg_at_1000
value: 59.699000000000005
- type: ndcg_at_20
value: 59.831
- type: ndcg_at_3
value: 54.75299999999999
- type: ndcg_at_5
value: 57.443999999999996
- type: precision_at_1
value: 49.356
- type: precision_at_10
value: 11.86
- type: precision_at_100
value: 1.1860000000000002
- type: precision_at_1000
value: 0.11900000000000001
- type: precision_at_20
value: 5.93
- type: precision_at_3
value: 26.895999999999997
- type: precision_at_5
value: 19.570999999999998
- type: recall_at_1
value: 39.232
- type: recall_at_10
value: 72.98400000000001
- type: recall_at_100
value: 72.98400000000001
- type: recall_at_1000
value: 72.98400000000001
- type: recall_at_20
value: 72.98400000000001
- type: recall_at_3
value: 56.213
- type: recall_at_5
value: 64.318
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackEnglishRetrieval
type: mteb/cqadupstack-english
config: default
split: test
revision: ad9991cb51e31e31e430383c75ffb2885547b5f0
metrics:
- type: map_at_1
value: 37.157000000000004
- type: map_at_10
value: 49.512
- type: map_at_100
value: 49.512
- type: map_at_1000
value: 49.512
- type: map_at_20
value: 49.512
- type: map_at_3
value: 46.099000000000004
- type: map_at_5
value: 48.061
- type: mrr_at_1
value: 47.516000000000005
- type: mrr_at_10
value: 55.803999999999995
- type: mrr_at_100
value: 55.803999999999995
- type: mrr_at_1000
value: 55.803999999999995
- type: mrr_at_20
value: 55.803999999999995
- type: mrr_at_3
value: 53.885000000000005
- type: mrr_at_5
value: 54.967999999999996
- type: ndcg_at_1
value: 47.516000000000005
- type: ndcg_at_10
value: 55.386
- type: ndcg_at_100
value: 54.952
- type: ndcg_at_1000
value: 54.952
- type: ndcg_at_20
value: 55.07300000000001
- type: ndcg_at_3
value: 51.458000000000006
- type: ndcg_at_5
value: 53.189
- type: precision_at_1
value: 47.516000000000005
- type: precision_at_10
value: 10.567
- type: precision_at_100
value: 1.057
- type: precision_at_1000
value: 0.106
- type: precision_at_20
value: 5.283
- type: precision_at_3
value: 25.393
- type: precision_at_5
value: 17.656
- type: recall_at_1
value: 37.157000000000004
- type: recall_at_10
value: 65.026
- type: recall_at_100
value: 65.026
- type: recall_at_1000
value: 65.026
- type: recall_at_20
value: 65.026
- type: recall_at_3
value: 52.36300000000001
- type: recall_at_5
value: 57.989999999999995
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGamingRetrieval
type: mteb/cqadupstack-gaming
config: default
split: test
revision: 4885aa143210c98657558c04aaf3dc47cfb54340
metrics:
- type: map_at_1
value: 48.522999999999996
- type: map_at_10
value: 62.844
- type: map_at_100
value: 62.844
- type: map_at_1000
value: 62.844
- type: map_at_20
value: 62.844
- type: map_at_3
value: 59.150999999999996
- type: map_at_5
value: 61.403
- type: mrr_at_1
value: 55.925000000000004
- type: mrr_at_10
value: 66.113
- type: mrr_at_100
value: 66.113
- type: mrr_at_1000
value: 66.113
- type: mrr_at_20
value: 66.113
- type: mrr_at_3
value: 63.783
- type: mrr_at_5
value: 65.212
- type: ndcg_at_1
value: 55.925000000000004
- type: ndcg_at_10
value: 68.869
- type: ndcg_at_100
value: 68.774
- type: ndcg_at_1000
value: 68.774
- type: ndcg_at_20
value: 68.777
- type: ndcg_at_3
value: 63.31400000000001
- type: ndcg_at_5
value: 66.247
- type: precision_at_1
value: 55.925000000000004
- type: precision_at_10
value: 10.997
- type: precision_at_100
value: 1.0999999999999999
- type: precision_at_1000
value: 0.11
- type: precision_at_20
value: 5.498
- type: precision_at_3
value: 28.359
- type: precision_at_5
value: 19.386
- type: recall_at_1
value: 48.522999999999996
- type: recall_at_10
value: 83.045
- type: recall_at_100
value: 83.045
- type: recall_at_1000
value: 83.045
- type: recall_at_20
value: 83.045
- type: recall_at_3
value: 68.449
- type: recall_at_5
value: 75.62100000000001
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGisRetrieval
type: mteb/cqadupstack-gis
config: default
split: test
revision: 5003b3064772da1887988e05400cf3806fe491f2
metrics:
- type: map_at_1
value: 30.726
- type: map_at_10
value: 40.433
- type: map_at_100
value: 40.433
- type: map_at_1000
value: 40.433
- type: map_at_20
value: 40.433
- type: map_at_3
value: 37.135
- type: map_at_5
value: 39.17
- type: mrr_at_1
value: 33.672000000000004
- type: mrr_at_10
value: 42.836
- type: mrr_at_100
value: 42.836
- type: mrr_at_1000
value: 42.836
- type: mrr_at_20
value: 42.836
- type: mrr_at_3
value: 39.755
- type: mrr_at_5
value: 41.631
- type: ndcg_at_1
value: 33.672000000000004
- type: ndcg_at_10
value: 46.092
- type: ndcg_at_100
value: 46.092
- type: ndcg_at_1000
value: 46.092
- type: ndcg_at_20
value: 46.092
- type: ndcg_at_3
value: 39.797
- type: ndcg_at_5
value: 43.171
- type: precision_at_1
value: 33.672000000000004
- type: precision_at_10
value: 7.073
- type: precision_at_100
value: 0.707
- type: precision_at_1000
value: 0.07100000000000001
- type: precision_at_20
value: 3.537
- type: precision_at_3
value: 16.648
- type: precision_at_5
value: 11.91
- type: recall_at_1
value: 30.726
- type: recall_at_10
value: 61.24000000000001
- type: recall_at_100
value: 61.24000000000001
- type: recall_at_1000
value: 61.24000000000001
- type: recall_at_20
value: 61.24000000000001
- type: recall_at_3
value: 44.557
- type: recall_at_5
value: 52.608999999999995
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackMathematicaRetrieval
type: mteb/cqadupstack-mathematica
config: default
split: test
revision: 90fceea13679c63fe563ded68f3b6f06e50061de
metrics:
- type: map_at_1
value: 21.554000000000002
- type: map_at_10
value: 31.508000000000003
- type: map_at_100
value: 31.508000000000003
- type: map_at_1000
value: 31.508000000000003
- type: map_at_20
value: 31.508000000000003
- type: map_at_3
value: 28.225
- type: map_at_5
value: 30.043
- type: mrr_at_1
value: 27.114
- type: mrr_at_10
value: 36.631
- type: mrr_at_100
value: 36.631
- type: mrr_at_1000
value: 36.631
- type: mrr_at_20
value: 36.631
- type: mrr_at_3
value: 34.059
- type: mrr_at_5
value: 35.601
- type: ndcg_at_1
value: 27.114
- type: ndcg_at_10
value: 37.592999999999996
- type: ndcg_at_100
value: 37.588
- type: ndcg_at_1000
value: 37.588
- type: ndcg_at_20
value: 37.588
- type: ndcg_at_3
value: 32.038
- type: ndcg_at_5
value: 34.689
- type: precision_at_1
value: 27.114
- type: precision_at_10
value: 7.090000000000001
- type: precision_at_100
value: 0.709
- type: precision_at_1000
value: 0.07100000000000001
- type: precision_at_20
value: 3.5450000000000004
- type: precision_at_3
value: 15.506
- type: precision_at_5
value: 11.393
- type: recall_at_1
value: 21.554000000000002
- type: recall_at_10
value: 50.879
- type: recall_at_100
value: 50.879
- type: recall_at_1000
value: 50.879
- type: recall_at_20
value: 50.879
- type: recall_at_3
value: 35.827999999999996
- type: recall_at_5
value: 42.476
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackPhysicsRetrieval
type: mteb/cqadupstack-physics
config: default
split: test
revision: 79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4
metrics:
- type: map_at_1
value: 35.36
- type: map_at_10
value: 48.483
- type: map_at_100
value: 48.483
- type: map_at_1000
value: 48.483
- type: map_at_20
value: 48.483
- type: map_at_3
value: 44.639
- type: map_at_5
value: 46.698
- type: mrr_at_1
value: 43.985
- type: mrr_at_10
value: 54.039
- type: mrr_at_100
value: 54.039
- type: mrr_at_1000
value: 54.039
- type: mrr_at_20
value: 54.039
- type: mrr_at_3
value: 51.54
- type: mrr_at_5
value: 52.859
- type: ndcg_at_1
value: 43.985
- type: ndcg_at_10
value: 55.069
- type: ndcg_at_100
value: 54.967
- type: ndcg_at_1000
value: 54.967
- type: ndcg_at_20
value: 54.996
- type: ndcg_at_3
value: 49.544
- type: ndcg_at_5
value: 51.932
- type: precision_at_1
value: 43.985
- type: precision_at_10
value: 10.202
- type: precision_at_100
value: 1.02
- type: precision_at_1000
value: 0.10200000000000001
- type: precision_at_20
value: 5.101
- type: precision_at_3
value: 23.933
- type: precision_at_5
value: 16.901
- type: recall_at_1
value: 35.36
- type: recall_at_10
value: 68.806
- type: recall_at_100
value: 68.806
- type: recall_at_1000
value: 68.806
- type: recall_at_20
value: 68.806
- type: recall_at_3
value: 52.714000000000006
- type: recall_at_5
value: 59.168
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackProgrammersRetrieval
type: mteb/cqadupstack-programmers
config: default
split: test
revision: 6184bc1440d2dbc7612be22b50686b8826d22b32
metrics:
- type: map_at_1
value: 32.431
- type: map_at_10
value: 45.421
- type: map_at_100
value: 45.421
- type: map_at_1000
value: 45.421
- type: map_at_20
value: 45.421
- type: map_at_3
value: 41.82
- type: map_at_5
value: 43.692
- type: mrr_at_1
value: 41.096
- type: mrr_at_10
value: 51.293
- type: mrr_at_100
value: 51.293
- type: mrr_at_1000
value: 51.293
- type: mrr_at_20
value: 51.293
- type: mrr_at_3
value: 49.049
- type: mrr_at_5
value: 50.327
- type: ndcg_at_1
value: 41.096
- type: ndcg_at_10
value: 52.032999999999994
- type: ndcg_at_100
value: 51.903
- type: ndcg_at_1000
value: 51.897999999999996
- type: ndcg_at_20
value: 51.942
- type: ndcg_at_3
value: 47.024
- type: ndcg_at_5
value: 49.071
- type: precision_at_1
value: 41.096
- type: precision_at_10
value: 9.725999999999999
- type: precision_at_100
value: 0.9730000000000001
- type: precision_at_1000
value: 0.097
- type: precision_at_20
value: 4.8629999999999995
- type: precision_at_3
value: 23.097
- type: precision_at_5
value: 16.096
- type: recall_at_1
value: 32.431
- type: recall_at_10
value: 65.42999999999999
- type: recall_at_100
value: 65.42999999999999
- type: recall_at_1000
value: 65.42999999999999
- type: recall_at_20
value: 65.42999999999999
- type: recall_at_3
value: 50.856
- type: recall_at_5
value: 56.846
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackRetrieval
type: mteb/cqadupstack
config: default
split: test
revision: ad9991cb51e31e31e430383c75ffb2885547b5f0
metrics:
- type: map_at_1
value: 32.074749999999995
- type: map_at_10
value: 43.474
- type: map_at_100
value: 43.474
- type: map_at_1000
value: 43.474
- type: map_at_20
value: 43.474
- type: map_at_3
value: 40.10458333333333
- type: map_at_5
value: 42.010749999999994
- type: mrr_at_1
value: 38.60425
- type: mrr_at_10
value: 48.05550000000001
- type: mrr_at_100
value: 48.05550000000001
- type: mrr_at_1000
value: 48.05550000000001
- type: mrr_at_20
value: 48.05550000000001
- type: mrr_at_3
value: 45.58083333333334
- type: mrr_at_5
value: 47.04750000000001
- type: ndcg_at_1
value: 38.60425
- type: ndcg_at_10
value: 49.51958333333334
- type: ndcg_at_100
value: 49.3385
- type: ndcg_at_1000
value: 49.33491666666667
- type: ndcg_at_20
value: 49.393
- type: ndcg_at_3
value: 44.32699999999999
- type: ndcg_at_5
value: 46.81008333333333
- type: precision_at_1
value: 38.60425
- type: precision_at_10
value: 8.800666666666668
- type: precision_at_100
value: 0.8800833333333334
- type: precision_at_1000
value: 0.08808333333333335
- type: precision_at_20
value: 4.400333333333334
- type: precision_at_3
value: 20.723166666666664
- type: precision_at_5
value: 14.65683333333333
- type: recall_at_1
value: 32.074749999999995
- type: recall_at_10
value: 62.5025
- type: recall_at_100
value: 62.5025
- type: recall_at_1000
value: 62.5025
- type: recall_at_20
value: 62.5025
- type: recall_at_3
value: 47.81091666666667
- type: recall_at_5
value: 54.38974999999999
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackStatsRetrieval
type: mteb/cqadupstack-stats
config: default
split: test
revision: 65ac3a16b8e91f9cee4c9828cc7c335575432a2a
metrics:
- type: map_at_1
value: 28.758
- type: map_at_10
value: 37.633
- type: map_at_100
value: 37.633
- type: map_at_1000
value: 37.633
- type: map_at_20
value: 37.633
- type: map_at_3
value: 34.865
- type: map_at_5
value: 36.437999999999995
- type: mrr_at_1
value: 32.208999999999996
- type: mrr_at_10
value: 40.598
- type: mrr_at_100
value: 40.598
- type: mrr_at_1000
value: 40.598
- type: mrr_at_20
value: 40.598
- type: mrr_at_3
value: 37.935
- type: mrr_at_5
value: 39.476
- type: ndcg_at_1
value: 32.208999999999996
- type: ndcg_at_10
value: 42.798
- type: ndcg_at_100
value: 42.768
- type: ndcg_at_1000
value: 42.768
- type: ndcg_at_20
value: 42.768
- type: ndcg_at_3
value: 37.651
- type: ndcg_at_5
value: 40.172999999999995
- type: precision_at_1
value: 32.208999999999996
- type: precision_at_10
value: 6.84
- type: precision_at_100
value: 0.6839999999999999
- type: precision_at_1000
value: 0.068
- type: precision_at_20
value: 3.42
- type: precision_at_3
value: 16.258
- type: precision_at_5
value: 11.472
- type: recall_at_1
value: 28.758
- type: recall_at_10
value: 55.55799999999999
- type: recall_at_100
value: 55.55799999999999
- type: recall_at_1000
value: 55.55799999999999
- type: recall_at_20
value: 55.55799999999999
- type: recall_at_3
value: 41.488
- type: recall_at_5
value: 47.659
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackTexRetrieval
type: mteb/cqadupstack-tex
config: default
split: test
revision: 46989137a86843e03a6195de44b09deda022eec7
metrics:
- type: map_at_1
value: 21.088
- type: map_at_10
value: 30.297
- type: map_at_100
value: 30.297
- type: map_at_1000
value: 30.297
- type: map_at_20
value: 30.297
- type: map_at_3
value: 27.376
- type: map_at_5
value: 29.064
- type: mrr_at_1
value: 26.358999999999998
- type: mrr_at_10
value: 34.996
- type: mrr_at_100
value: 34.996
- type: mrr_at_1000
value: 34.996
- type: mrr_at_20
value: 34.996
- type: mrr_at_3
value: 32.467
- type: mrr_at_5
value: 33.944
- type: ndcg_at_1
value: 26.358999999999998
- type: ndcg_at_10
value: 35.851
- type: ndcg_at_100
value: 35.731
- type: ndcg_at_1000
value: 35.729
- type: ndcg_at_20
value: 35.77
- type: ndcg_at_3
value: 30.97
- type: ndcg_at_5
value: 33.312000000000005
- type: precision_at_1
value: 26.358999999999998
- type: precision_at_10
value: 6.641
- type: precision_at_100
value: 0.664
- type: precision_at_1000
value: 0.066
- type: precision_at_20
value: 3.321
- type: precision_at_3
value: 14.923
- type: precision_at_5
value: 10.86
- type: recall_at_1
value: 21.088
- type: recall_at_10
value: 47.818
- type: recall_at_100
value: 47.818
- type: recall_at_1000
value: 47.818
- type: recall_at_20
value: 47.818
- type: recall_at_3
value: 33.815
- type: recall_at_5
value: 39.973
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackUnixRetrieval
type: mteb/cqadupstack-unix
config: default
split: test
revision: 6c6430d3a6d36f8d2a829195bc5dc94d7e063e53
metrics:
- type: map_at_1
value: 33.579
- type: map_at_10
value: 44.875
- type: map_at_100
value: 44.875
- type: map_at_1000
value: 44.875
- type: map_at_20
value: 44.875
- type: map_at_3
value: 41.64
- type: map_at_5
value: 43.433
- type: mrr_at_1
value: 40.111999999999995
- type: mrr_at_10
value: 49.586999999999996
- type: mrr_at_100
value: 49.586999999999996
- type: mrr_at_1000
value: 49.586999999999996
- type: mrr_at_20
value: 49.586999999999996
- type: mrr_at_3
value: 47.233000000000004
- type: mrr_at_5
value: 48.613
- type: ndcg_at_1
value: 40.111999999999995
- type: ndcg_at_10
value: 50.836000000000006
- type: ndcg_at_100
value: 50.822
- type: ndcg_at_1000
value: 50.822
- type: ndcg_at_20
value: 50.822
- type: ndcg_at_3
value: 45.737
- type: ndcg_at_5
value: 48.081
- type: precision_at_1
value: 40.111999999999995
- type: precision_at_10
value: 8.674999999999999
- type: precision_at_100
value: 0.868
- type: precision_at_1000
value: 0.087
- type: precision_at_20
value: 4.338
- type: precision_at_3
value: 21.02
- type: precision_at_5
value: 14.682999999999998
- type: recall_at_1
value: 33.579
- type: recall_at_10
value: 64.02600000000001
- type: recall_at_100
value: 64.02600000000001
- type: recall_at_1000
value: 64.02600000000001
- type: recall_at_20
value: 64.02600000000001
- type: recall_at_3
value: 49.788
- type: recall_at_5
value: 55.931
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackWebmastersRetrieval
type: mteb/cqadupstack-webmasters
config: default
split: test
revision: 160c094312a0e1facb97e55eeddb698c0abe3571
metrics:
- type: map_at_1
value: 31.497999999999998
- type: map_at_10
value: 43.456
- type: map_at_100
value: 43.456
- type: map_at_1000
value: 43.456
- type: map_at_20
value: 43.456
- type: map_at_3
value: 40.125
- type: map_at_5
value: 41.829
- type: mrr_at_1
value: 38.735
- type: mrr_at_10
value: 48.756
- type: mrr_at_100
value: 48.756
- type: mrr_at_1000
value: 48.756
- type: mrr_at_20
value: 48.756
- type: mrr_at_3
value: 46.113
- type: mrr_at_5
value: 47.684
- type: ndcg_at_1
value: 38.735
- type: ndcg_at_10
value: 50.241
- type: ndcg_at_100
value: 49.458
- type: ndcg_at_1000
value: 49.437999999999995
- type: ndcg_at_20
value: 49.756
- type: ndcg_at_3
value: 45.14
- type: ndcg_at_5
value: 47.406
- type: precision_at_1
value: 38.735
- type: precision_at_10
value: 9.763
- type: precision_at_100
value: 0.976
- type: precision_at_1000
value: 0.098
- type: precision_at_20
value: 4.881
- type: precision_at_3
value: 21.673000000000002
- type: precision_at_5
value: 15.455
- type: recall_at_1
value: 31.497999999999998
- type: recall_at_10
value: 62.568999999999996
- type: recall_at_100
value: 62.568999999999996
- type: recall_at_1000
value: 62.568999999999996
- type: recall_at_20
value: 62.568999999999996
- type: recall_at_3
value: 47.842
- type: recall_at_5
value: 54.159
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackWordpressRetrieval
type: mteb/cqadupstack-wordpress
config: default
split: test
revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4
metrics:
- type: map_at_1
value: 24.991
- type: map_at_10
value: 34.183
- type: map_at_100
value: 34.183
- type: map_at_1000
value: 34.183
- type: map_at_20
value: 34.183
- type: map_at_3
value: 31.592
- type: map_at_5
value: 33.121
- type: mrr_at_1
value: 27.172
- type: mrr_at_10
value: 36.463
- type: mrr_at_100
value: 36.463
- type: mrr_at_1000
value: 36.463
- type: mrr_at_20
value: 36.463
- type: mrr_at_3
value: 34.165
- type: mrr_at_5
value: 35.616
- type: ndcg_at_1
value: 27.172
- type: ndcg_at_10
value: 39.311
- type: ndcg_at_100
value: 39.292
- type: ndcg_at_1000
value: 39.292
- type: ndcg_at_20
value: 39.301
- type: ndcg_at_3
value: 34.498
- type: ndcg_at_5
value: 37.006
- type: precision_at_1
value: 27.172
- type: precision_at_10
value: 6.174
- type: precision_at_100
value: 0.617
- type: precision_at_1000
value: 0.062
- type: precision_at_20
value: 3.087
- type: precision_at_3
value: 14.972
- type: precision_at_5
value: 10.499
- type: recall_at_1
value: 24.991
- type: recall_at_10
value: 52.649
- type: recall_at_100
value: 52.649
- type: recall_at_1000
value: 52.649
- type: recall_at_20
value: 52.649
- type: recall_at_3
value: 39.818
- type: recall_at_5
value: 45.927
- task:
type: Retrieval
dataset:
name: MTEB ClimateFEVER
type: mteb/climate-fever
config: default
split: test
revision: 47f2ac6acb640fc46020b02a5b59fdda04d39380
metrics:
- type: map_at_1
value: 12.475999999999999
- type: map_at_10
value: 22.999
- type: map_at_100
value: 22.999
- type: map_at_1000
value: 22.999
- type: map_at_20
value: 22.999
- type: map_at_3
value: 18.804000000000002
- type: map_at_5
value: 20.987000000000002
- type: mrr_at_1
value: 28.404
- type: mrr_at_10
value: 42.335
- type: mrr_at_100
value: 42.335
- type: mrr_at_1000
value: 42.335
- type: mrr_at_20
value: 42.335
- type: mrr_at_3
value: 39.11
- type: mrr_at_5
value: 40.953
- type: ndcg_at_1
value: 28.404
- type: ndcg_at_10
value: 32.467
- type: ndcg_at_100
value: 32.467
- type: ndcg_at_1000
value: 32.467
- type: ndcg_at_20
value: 32.467
- type: ndcg_at_3
value: 26.334999999999997
- type: ndcg_at_5
value: 28.493000000000002
- type: precision_at_1
value: 28.404
- type: precision_at_10
value: 10.43
- type: precision_at_100
value: 1.043
- type: precision_at_1000
value: 0.104
- type: precision_at_20
value: 5.215
- type: precision_at_3
value: 20.13
- type: precision_at_5
value: 15.595999999999998
- type: recall_at_1
value: 12.475999999999999
- type: recall_at_10
value: 39.757
- type: recall_at_100
value: 39.757
- type: recall_at_1000
value: 39.757
- type: recall_at_20
value: 39.757
- type: recall_at_3
value: 24.695
- type: recall_at_5
value: 30.864000000000004
- task:
type: Retrieval
dataset:
name: MTEB DBPedia
type: mteb/dbpedia
config: default
split: test
revision: c0f706b76e590d620bd6618b3ca8efdd34e2d659
metrics:
- type: map_at_1
value: 9.261999999999999
- type: map_at_10
value: 23.807000000000002
- type: map_at_100
value: 23.807000000000002
- type: map_at_1000
value: 23.807000000000002
- type: map_at_20
value: 23.807000000000002
- type: map_at_3
value: 15.776000000000002
- type: map_at_5
value: 19.17
- type: mrr_at_1
value: 71.75
- type: mrr_at_10
value: 79.959
- type: mrr_at_100
value: 79.959
- type: mrr_at_1000
value: 79.959
- type: mrr_at_20
value: 79.959
- type: mrr_at_3
value: 78.625
- type: mrr_at_5
value: 79.412
- type: ndcg_at_1
value: 59.5
- type: ndcg_at_10
value: 48.988
- type: ndcg_at_100
value: 37.452000000000005
- type: ndcg_at_1000
value: 37.32
- type: ndcg_at_20
value: 41.387
- type: ndcg_at_3
value: 52.567
- type: ndcg_at_5
value: 50.649
- type: precision_at_1
value: 71.75
- type: precision_at_10
value: 40.425
- type: precision_at_100
value: 4.042
- type: precision_at_1000
value: 0.404
- type: precision_at_20
value: 20.212
- type: precision_at_3
value: 57.75
- type: precision_at_5
value: 50.349999999999994
- type: recall_at_1
value: 9.261999999999999
- type: recall_at_10
value: 30.329
- type: recall_at_100
value: 30.329
- type: recall_at_1000
value: 30.329
- type: recall_at_20
value: 30.329
- type: recall_at_3
value: 17.422
- type: recall_at_5
value: 22.598
- task:
type: Classification
dataset:
name: MTEB EmotionClassification
type: mteb/emotion
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 52.014999999999986
- type: f1
value: 47.33036786740981
- task:
type: Retrieval
dataset:
name: MTEB FEVER
type: mteb/fever
config: default
split: test
revision: bea83ef9e8fb933d90a2f1d5515737465d613e12
metrics:
- type: map_at_1
value: 82.00800000000001
- type: map_at_10
value: 88.02799999999999
- type: map_at_100
value: 88.02799999999999
- type: map_at_1000
value: 88.02799999999999
- type: map_at_20
value: 88.02799999999999
- type: map_at_3
value: 87.249
- type: map_at_5
value: 87.78399999999999
- type: mrr_at_1
value: 88.299
- type: mrr_at_10
value: 92.92
- type: mrr_at_100
value: 92.92
- type: mrr_at_1000
value: 92.92
- type: mrr_at_20
value: 92.92
- type: mrr_at_3
value: 92.56400000000001
- type: mrr_at_5
value: 92.83200000000001
- type: ndcg_at_1
value: 88.299
- type: ndcg_at_10
value: 90.88000000000001
- type: ndcg_at_100
value: 90.879
- type: ndcg_at_1000
value: 90.879
- type: ndcg_at_20
value: 90.879
- type: ndcg_at_3
value: 89.85499999999999
- type: ndcg_at_5
value: 90.485
- type: precision_at_1
value: 88.299
- type: precision_at_10
value: 10.522
- type: precision_at_100
value: 1.052
- type: precision_at_1000
value: 0.105
- type: precision_at_20
value: 5.261
- type: precision_at_3
value: 33.573
- type: precision_at_5
value: 20.633000000000003
- type: recall_at_1
value: 82.00800000000001
- type: recall_at_10
value: 94.952
- type: recall_at_100
value: 94.952
- type: recall_at_1000
value: 94.952
- type: recall_at_20
value: 94.952
- type: recall_at_3
value: 92.089
- type: recall_at_5
value: 93.794
- task:
type: Retrieval
dataset:
name: MTEB FiQA2018
type: mteb/fiqa
config: default
split: test
revision: 27a168819829fe9bcd655c2df245fb19452e8e06
metrics:
- type: map_at_1
value: 26.857
- type: map_at_10
value: 44.645
- type: map_at_100
value: 44.645
- type: map_at_1000
value: 44.645
- type: map_at_20
value: 44.645
- type: map_at_3
value: 38.166
- type: map_at_5
value: 41.992000000000004
- type: mrr_at_1
value: 50.309000000000005
- type: mrr_at_10
value: 59.59100000000001
- type: mrr_at_100
value: 59.59100000000001
- type: mrr_at_1000
value: 59.59100000000001
- type: mrr_at_20
value: 59.59100000000001
- type: mrr_at_3
value: 56.97
- type: mrr_at_5
value: 58.498000000000005
- type: ndcg_at_1
value: 50.309000000000005
- type: ndcg_at_10
value: 53.221
- type: ndcg_at_100
value: 53.15800000000001
- type: ndcg_at_1000
value: 53.15800000000001
- type: ndcg_at_20
value: 53.15800000000001
- type: ndcg_at_3
value: 47.506
- type: ndcg_at_5
value: 49.922
- type: precision_at_1
value: 50.309000000000005
- type: precision_at_10
value: 14.985000000000001
- type: precision_at_100
value: 1.498
- type: precision_at_1000
value: 0.15
- type: precision_at_20
value: 7.492
- type: precision_at_3
value: 31.635999999999996
- type: precision_at_5
value: 24.043
- type: recall_at_1
value: 26.857
- type: recall_at_10
value: 62.051
- type: recall_at_100
value: 62.051
- type: recall_at_1000
value: 62.051
- type: recall_at_20
value: 62.051
- type: recall_at_3
value: 42.966
- type: recall_at_5
value: 51.943
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA
type: mteb/hotpotqa
config: default
split: test
revision: ab518f4d6fcca38d87c25209f94beba119d02014
metrics:
- type: map_at_1
value: 40.891
- type: map_at_10
value: 70.431
- type: map_at_100
value: 70.431
- type: map_at_1000
value: 70.431
- type: map_at_20
value: 70.431
- type: map_at_3
value: 66.704
- type: map_at_5
value: 69.179
- type: mrr_at_1
value: 81.783
- type: mrr_at_10
value: 87.368
- type: mrr_at_100
value: 87.368
- type: mrr_at_1000
value: 87.368
- type: mrr_at_20
value: 87.368
- type: mrr_at_3
value: 86.59700000000001
- type: mrr_at_5
value: 87.128
- type: ndcg_at_1
value: 81.783
- type: ndcg_at_10
value: 77.697
- type: ndcg_at_100
value: 77.697
- type: ndcg_at_1000
value: 77.697
- type: ndcg_at_20
value: 77.697
- type: ndcg_at_3
value: 72.688
- type: ndcg_at_5
value: 75.69200000000001
- type: precision_at_1
value: 81.783
- type: precision_at_10
value: 16.488
- type: precision_at_100
value: 1.649
- type: precision_at_1000
value: 0.165
- type: precision_at_20
value: 8.244
- type: precision_at_3
value: 47.693000000000005
- type: precision_at_5
value: 30.976
- type: recall_at_1
value: 40.891
- type: recall_at_10
value: 82.438
- type: recall_at_100
value: 82.438
- type: recall_at_1000
value: 82.438
- type: recall_at_20
value: 82.438
- type: recall_at_3
value: 71.54
- type: recall_at_5
value: 77.441
- task:
type: Classification
dataset:
name: MTEB ImdbClassification
type: mteb/imdb
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 89.47240000000001
- type: ap
value: 85.75618304701787
- type: f1
value: 89.44156774176075
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO
type: mteb/msmarco
config: default
split: dev
revision: c5a29a104738b98a9e76336939199e264163d4a0
metrics:
- type: map_at_1
value: 19.941
- type: map_at_10
value: 33.108
- type: map_at_100
value: 33.108
- type: map_at_1000
value: 33.108
- type: map_at_20
value: 33.108
- type: map_at_3
value: 28.716
- type: map_at_5
value: 31.255
- type: mrr_at_1
value: 20.458000000000002
- type: mrr_at_10
value: 33.646
- type: mrr_at_100
value: 33.646
- type: mrr_at_1000
value: 33.646
- type: mrr_at_20
value: 33.646
- type: mrr_at_3
value: 29.360000000000003
- type: mrr_at_5
value: 31.849
- type: ndcg_at_1
value: 20.458000000000002
- type: ndcg_at_10
value: 40.664
- type: ndcg_at_100
value: 40.664
- type: ndcg_at_1000
value: 40.664
- type: ndcg_at_20
value: 40.664
- type: ndcg_at_3
value: 31.733
- type: ndcg_at_5
value: 36.266999999999996
- type: precision_at_1
value: 20.458000000000002
- type: precision_at_10
value: 6.703
- type: precision_at_100
value: 0.67
- type: precision_at_1000
value: 0.067
- type: precision_at_20
value: 3.3520000000000003
- type: precision_at_3
value: 13.777000000000001
- type: precision_at_5
value: 10.564
- type: recall_at_1
value: 19.941
- type: recall_at_10
value: 64.103
- type: recall_at_100
value: 64.103
- type: recall_at_1000
value: 64.103
- type: recall_at_20
value: 64.103
- type: recall_at_3
value: 39.800999999999995
- type: recall_at_5
value: 50.727999999999994
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (en)
type: mteb/mtop_domain
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 96.45690834473322
- type: f1
value: 96.19980363353172
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (en)
type: mteb/mtop_intent
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 85.38075695394436
- type: f1
value: 71.33409850817071
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (en)
type: mteb/amazon_massive_intent
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 80.12104909213183
- type: f1
value: 77.26691038674358
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (en)
type: mteb/amazon_massive_scenario
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 82.69670477471418
- type: f1
value: 82.31935226516424
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringP2P
type: mteb/medrxiv-clustering-p2p
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 32.733209733023
- type: v_measures
value:
- 0.3268102022520237
- 0.30894802212942296
- 0.3267412500148118
- 0.3083054819872514
- 0.31284256226804597
- 0.33514297956992917
- 0.3297363893986241
- 0.34536511251773544
- 0.3498041803334763
- 0.3296247928309792
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringS2S
type: mteb/medrxiv-clustering-s2s
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 32.325298069936835
- type: v_measures
value:
- 0.3150861010517341
- 0.32155893979978684
- 0.3177517595474066
- 0.30022420485295037
- 0.3197693379138355
- 0.33188657891678974
- 0.3386256684441414
- 0.3305481006752689
- 0.3436443696432427
- 0.31343474614852757
- 0.3150861010517341
- 0.32155893979978684
- 0.3177517595474066
- 0.30022420485295037
- 0.3197693379138355
- 0.33188657891678974
- 0.3386256684441414
- 0.3305481006752689
- 0.3436443696432427
- 0.31343474614852757
- 0.3150861010517341
- 0.32155893979978684
- 0.3177517595474066
- 0.30022420485295037
- 0.3197693379138355
- 0.33188657891678974
- 0.3386256684441414
- 0.3305481006752689
- 0.3436443696432427
- 0.31343474614852757
- 0.3150861010517341
- 0.32155893979978684
- 0.3177517595474066
- 0.30022420485295037
- 0.3197693379138355
- 0.33188657891678974
- 0.3386256684441414
- 0.3305481006752689
- 0.3436443696432427
- 0.31343474614852757
- 0.3150861010517341
- 0.32155893979978684
- 0.3177517595474066
- 0.30022420485295037
- 0.3197693379138355
- 0.33188657891678974
- 0.3386256684441414
- 0.3305481006752689
- 0.3436443696432427
- 0.31343474614852757
- 0.3150861010517341
- 0.32155893979978684
- 0.3177517595474066
- 0.30022420485295037
- 0.3197693379138355
- 0.33188657891678974
- 0.3386256684441414
- 0.3305481006752689
- 0.3436443696432427
- 0.31343474614852757
- 0.3150861010517341
- 0.32155893979978684
- 0.3177517595474066
- 0.30022420485295037
- 0.3197693379138355
- 0.33188657891678974
- 0.3386256684441414
- 0.3305481006752689
- 0.3436443696432427
- 0.31343474614852757
- 0.3150861010517341
- 0.32155893979978684
- 0.3177517595474066
- 0.30022420485295037
- 0.3197693379138355
- 0.33188657891678974
- 0.3386256684441414
- 0.3305481006752689
- 0.3436443696432427
- 0.31343474614852757
- 0.3150861010517341
- 0.32155893979978684
- 0.3177517595474066
- 0.30022420485295037
- 0.3197693379138355
- 0.33188657891678974
- 0.3386256684441414
- 0.3305481006752689
- 0.3436443696432427
- 0.31343474614852757
- 0.3150861010517341
- 0.32155893979978684
- 0.3177517595474066
- 0.30022420485295037
- 0.3197693379138355
- 0.33188657891678974
- 0.3386256684441414
- 0.3305481006752689
- 0.3436443696432427
- 0.31343474614852757
- 0.3150861010517341
- 0.32155893979978684
- 0.3177517595474066
- 0.30022420485295037
- 0.3197693379138355
- 0.33188657891678974
- 0.3386256684441414
- 0.3305481006752689
- 0.3436443696432427
- 0.31343474614852757
- 0.3150861010517341
- 0.32155893979978684
- 0.3177517595474066
- 0.30022420485295037
- 0.3197693379138355
- 0.33188657891678974
- 0.3386256684441414
- 0.3305481006752689
- 0.3436443696432427
- 0.31343474614852757
- 0.3150861010517341
- 0.32155893979978684
- 0.3177517595474066
- 0.30022420485295037
- 0.3197693379138355
- 0.33188657891678974
- 0.3386256684441414
- 0.3305481006752689
- 0.3436443696432427
- 0.31343474614852757
- 0.3150861010517341
- 0.32155893979978684
- 0.3177517595474066
- 0.30022420485295037
- 0.3197693379138355
- 0.33188657891678974
- 0.3386256684441414
- 0.3305481006752689
- 0.3436443696432427
- 0.31343474614852757
- 0.3150861010517341
- 0.32155893979978684
- 0.3177517595474066
- 0.30022420485295037
- 0.3197693379138355
- 0.33188657891678974
- 0.3386256684441414
- 0.3305481006752689
- 0.3436443696432427
- 0.31343474614852757
- 0.3150861010517341
- 0.32155893979978684
- 0.3177517595474066
- 0.30022420485295037
- 0.3197693379138355
- 0.33188657891678974
- 0.3386256684441414
- 0.3305481006752689
- 0.3436443696432427
- 0.31343474614852757
- 0.3150861010517341
- 0.32155893979978684
- 0.3177517595474066
- 0.30022420485295037
- 0.3197693379138355
- 0.33188657891678974
- 0.3386256684441414
- 0.3305481006752689
- 0.3436443696432427
- 0.31343474614852757
- 0.3150861010517341
- 0.32155893979978684
- 0.3177517595474066
- 0.30022420485295037
- 0.3197693379138355
- 0.33188657891678974
- 0.3386256684441414
- 0.3305481006752689
- 0.3436443696432427
- 0.31343474614852757
- 0.3150861010517341
- 0.32155893979978684
- 0.3177517595474066
- 0.30022420485295037
- 0.3197693379138355
- 0.33188657891678974
- 0.3386256684441414
- 0.3305481006752689
- 0.3436443696432427
- 0.31343474614852757
- 0.3150861010517341
- 0.32155893979978684
- 0.3177517595474066
- 0.30022420485295037
- 0.3197693379138355
- 0.33188657891678974
- 0.3386256684441414
- 0.3305481006752689
- 0.3436443696432427
- 0.31343474614852757
- 0.3150861010517341
- 0.32155893979978684
- 0.3177517595474066
- 0.30022420485295037
- 0.3197693379138355
- 0.33188657891678974
- 0.3386256684441414
- 0.3305481006752689
- 0.3436443696432427
- 0.31343474614852757
- 0.3150861010517341
- 0.32155893979978684
- 0.3177517595474066
- 0.30022420485295037
- 0.3197693379138355
- 0.33188657891678974
- 0.3386256684441414
- 0.3305481006752689
- 0.3436443696432427
- 0.31343474614852757
- 0.3150861010517341
- 0.32155893979978684
- 0.3177517595474066
- 0.30022420485295037
- 0.3197693379138355
- 0.33188657891678974
- 0.3386256684441414
- 0.3305481006752689
- 0.3436443696432427
- 0.31343474614852757
- 0.3150861010517341
- 0.32155893979978684
- 0.3177517595474066
- 0.30022420485295037
- 0.3197693379138355
- 0.33188657891678974
- 0.3386256684441414
- 0.3305481006752689
- 0.3436443696432427
- 0.31343474614852757
- 0.3150861010517341
- 0.32155893979978684
- 0.3177517595474066
- 0.30022420485295037
- 0.3197693379138355
- 0.33188657891678974
- 0.3386256684441414
- 0.3305481006752689
- 0.3436443696432427
- 0.31343474614852757
- 0.3150861010517341
- 0.32155893979978684
- 0.3177517595474066
- 0.30022420485295037
- 0.3197693379138355
- 0.33188657891678974
- 0.3386256684441414
- 0.3305481006752689
- 0.3436443696432427
- 0.31343474614852757
- 0.3150861010517341
- 0.32155893979978684
- 0.3177517595474066
- 0.30022420485295037
- 0.3197693379138355
- 0.33188657891678974
- 0.3386256684441414
- 0.3305481006752689
- 0.3436443696432427
- 0.31343474614852757
- 0.3150861010517341
- 0.32155893979978684
- 0.3177517595474066
- 0.30022420485295037
- 0.3197693379138355
- 0.33188657891678974
- 0.3386256684441414
- 0.3305481006752689
- 0.3436443696432427
- 0.31343474614852757
- 0.3150861010517341
- 0.32155893979978684
- 0.3177517595474066
- 0.30022420485295037
- 0.3197693379138355
- 0.33188657891678974
- 0.3386256684441414
- 0.3305481006752689
- 0.3436443696432427
- 0.31343474614852757
- 0.3150861010517341
- 0.32155893979978684
- 0.3177517595474066
- 0.30022420485295037
- 0.3197693379138355
- 0.33188657891678974
- 0.3386256684441414
- task:
type: Reranking
dataset:
name: MTEB MindSmallReranking
type: mteb/mind_small
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 32.511595472837335
- type: mrr
value: 33.73044905745997
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus
type: mteb/nfcorpus
config: default
split: test
revision: ec0fa4fe99da2ff19ca1214b7966684033a58814
metrics:
- type: map_at_1
value: 7.0760000000000005
- type: map_at_10
value: 16.039
- type: map_at_100
value: 16.039
- type: map_at_1000
value: 16.039
- type: map_at_20
value: 16.039
- type: map_at_3
value: 11.408
- type: map_at_5
value: 13.547
- type: mrr_at_1
value: 53.559999999999995
- type: mrr_at_10
value: 61.531000000000006
- type: mrr_at_100
value: 61.531000000000006
- type: mrr_at_1000
value: 61.531000000000006
- type: mrr_at_20
value: 61.531000000000006
- type: mrr_at_3
value: 59.236
- type: mrr_at_5
value: 60.49
- type: ndcg_at_1
value: 51.083999999999996
- type: ndcg_at_10
value: 41.332
- type: ndcg_at_100
value: 27.083000000000002
- type: ndcg_at_1000
value: 26.619
- type: ndcg_at_20
value: 33.188
- type: ndcg_at_3
value: 46.605999999999995
- type: ndcg_at_5
value: 44.362
- type: precision_at_1
value: 52.941
- type: precision_at_10
value: 30.65
- type: precision_at_100
value: 3.065
- type: precision_at_1000
value: 0.307
- type: precision_at_20
value: 15.325
- type: precision_at_3
value: 43.447
- type: precision_at_5
value: 38.266
- type: recall_at_1
value: 7.0760000000000005
- type: recall_at_10
value: 20.929000000000002
- type: recall_at_100
value: 20.929000000000002
- type: recall_at_1000
value: 20.929000000000002
- type: recall_at_20
value: 20.929000000000002
- type: recall_at_3
value: 12.601
- type: recall_at_5
value: 15.955
- task:
type: Retrieval
dataset:
name: MTEB NQ
type: mteb/nq
config: default
split: test
revision: b774495ed302d8c44a3a7ea25c90dbce03968f31
metrics:
- type: map_at_1
value: 39.204
- type: map_at_10
value: 56.808
- type: map_at_100
value: 56.808
- type: map_at_1000
value: 56.808
- type: map_at_20
value: 56.808
- type: map_at_3
value: 52.471999999999994
- type: map_at_5
value: 55.191
- type: mrr_at_1
value: 44.032
- type: mrr_at_10
value: 59.158
- type: mrr_at_100
value: 59.158
- type: mrr_at_1000
value: 59.158
- type: mrr_at_20
value: 59.158
- type: mrr_at_3
value: 55.948
- type: mrr_at_5
value: 57.96
- type: ndcg_at_1
value: 44.032
- type: ndcg_at_10
value: 64.672
- type: ndcg_at_100
value: 64.672
- type: ndcg_at_1000
value: 64.672
- type: ndcg_at_20
value: 64.672
- type: ndcg_at_3
value: 56.955999999999996
- type: ndcg_at_5
value: 61.278999999999996
- type: precision_at_1
value: 44.032
- type: precision_at_10
value: 10.295
- type: precision_at_100
value: 1.03
- type: precision_at_1000
value: 0.10300000000000001
- type: precision_at_20
value: 5.148
- type: precision_at_3
value: 25.83
- type: precision_at_5
value: 18.053
- type: recall_at_1
value: 39.204
- type: recall_at_10
value: 85.936
- type: recall_at_100
value: 85.936
- type: recall_at_1000
value: 85.936
- type: recall_at_20
value: 85.936
- type: recall_at_3
value: 66.387
- type: recall_at_5
value: 76.238
- task:
type: Retrieval
dataset:
name: MTEB QuoraRetrieval
type: mteb/quora
config: default
split: test
revision: e4e08e0b7dbe3c8700f0daef558ff32256715259
metrics:
- type: map_at_1
value: 71.068
- type: map_at_10
value: 85.271
- type: map_at_100
value: 85.271
- type: map_at_1000
value: 85.271
- type: map_at_20
value: 85.271
- type: map_at_3
value: 82.23899999999999
- type: map_at_5
value: 84.165
- type: mrr_at_1
value: 81.85
- type: mrr_at_10
value: 87.856
- type: mrr_at_100
value: 87.856
- type: mrr_at_1000
value: 87.856
- type: mrr_at_20
value: 87.856
- type: mrr_at_3
value: 86.925
- type: mrr_at_5
value: 87.559
- type: ndcg_at_1
value: 81.89
- type: ndcg_at_10
value: 88.856
- type: ndcg_at_100
value: 88.723
- type: ndcg_at_1000
value: 88.723
- type: ndcg_at_20
value: 88.74300000000001
- type: ndcg_at_3
value: 86.05199999999999
- type: ndcg_at_5
value: 87.61
- type: precision_at_1
value: 81.89
- type: precision_at_10
value: 13.569999999999999
- type: precision_at_100
value: 1.357
- type: precision_at_1000
value: 0.136
- type: precision_at_20
value: 6.784999999999999
- type: precision_at_3
value: 37.807
- type: precision_at_5
value: 24.908
- type: recall_at_1
value: 71.068
- type: recall_at_10
value: 95.797
- type: recall_at_100
value: 95.797
- type: recall_at_1000
value: 95.797
- type: recall_at_20
value: 95.797
- type: recall_at_3
value: 87.65899999999999
- type: recall_at_5
value: 92.107
- task:
type: Clustering
dataset:
name: MTEB RedditClustering
type: mteb/reddit-clustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 62.16385792305745
- type: v_measures
value:
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- task:
type: Clustering
dataset:
name: MTEB RedditClusteringP2P
type: mteb/reddit-clustering-p2p
config: default
split: test
revision: 385e3cb46b4cfa89021f56c4380204149d0efe33
metrics:
- type: v_measure
value: 65.96296778394698
- type: v_measures
value:
- 0.6994519018160104
- 0.6685310786302858
- 0.6560344637869603
- 0.4053977970605211
- 0.7423583342619767
- 0.6657872853200192
- 0.4487425514322897
- 0.7791528368405061
- 0.7406529421724692
- 0.7901875870736593
- task:
type: Retrieval
dataset:
name: MTEB SCIDOCS
type: mteb/scidocs
config: default
split: test
revision: f8c2fcf00f625baaa80f62ec5bd9e1fff3b8ae88
metrics:
- type: map_at_1
value: 5.433000000000001
- type: map_at_10
value: 13.991000000000001
- type: map_at_100
value: 13.991000000000001
- type: map_at_1000
value: 13.991000000000001
- type: map_at_20
value: 13.991000000000001
- type: map_at_3
value: 9.708
- type: map_at_5
value: 11.849
- type: mrr_at_1
value: 26.8
- type: mrr_at_10
value: 38.012
- type: mrr_at_100
value: 38.012
- type: mrr_at_1000
value: 38.012
- type: mrr_at_20
value: 38.012
- type: mrr_at_3
value: 34.449999999999996
- type: mrr_at_5
value: 36.59
- type: ndcg_at_1
value: 26.8
- type: ndcg_at_10
value: 23.006999999999998
- type: ndcg_at_100
value: 23.006999999999998
- type: ndcg_at_1000
value: 23.006999999999998
- type: ndcg_at_20
value: 23.006999999999998
- type: ndcg_at_3
value: 21.386
- type: ndcg_at_5
value: 19.046
- type: precision_at_1
value: 26.8
- type: precision_at_10
value: 12.01
- type: precision_at_100
value: 1.201
- type: precision_at_1000
value: 0.12
- type: precision_at_20
value: 6.005
- type: precision_at_3
value: 19.833000000000002
- type: precision_at_5
value: 16.84
- type: recall_at_1
value: 5.433000000000001
- type: recall_at_10
value: 24.34
- type: recall_at_100
value: 24.34
- type: recall_at_1000
value: 24.34
- type: recall_at_20
value: 24.34
- type: recall_at_3
value: 12.058
- type: recall_at_5
value: 17.058
- task:
type: STS
dataset:
name: MTEB SICK-R
type: mteb/sickr-sts
config: default
split: test
revision: 20a6d6f312dd54037fe07a32d58e5e168867909d
metrics:
- type: cos_sim_pearson
value: 84.84178272773948
- type: cos_sim_spearman
value: 82.32746830315172
- type: euclidean_pearson
value: 82.11599650658388
- type: euclidean_spearman
value: 82.38102437050075
- type: manhattan_pearson
value: 82.07071847892156
- type: manhattan_spearman
value: 82.35710877093594
- task:
type: STS
dataset:
name: MTEB STS12
type: mteb/sts12-sts
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_pearson
value: 86.86916828280668
- type: cos_sim_spearman
value: 79.69553254808825
- type: euclidean_pearson
value: 82.86582224049857
- type: euclidean_spearman
value: 79.1765897124049
- type: manhattan_pearson
value: 83.15978473993391
- type: manhattan_spearman
value: 79.54192003597332
- task:
type: STS
dataset:
name: MTEB STS13
type: mteb/sts13-sts
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_pearson
value: 88.7719804239987
- type: cos_sim_spearman
value: 89.20788765830103
- type: euclidean_pearson
value: 88.67624029627581
- type: euclidean_spearman
value: 89.15058058277351
- type: manhattan_pearson
value: 88.43477620818435
- type: manhattan_spearman
value: 89.01994285052193
- task:
type: STS
dataset:
name: MTEB STS14
type: mteb/sts14-sts
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_pearson
value: 87.04733612348426
- type: cos_sim_spearman
value: 86.0120242985069
- type: euclidean_pearson
value: 86.07045247599824
- type: euclidean_spearman
value: 86.22185577032168
- type: manhattan_pearson
value: 85.79555943035328
- type: manhattan_spearman
value: 86.13821651705776
- task:
type: STS
dataset:
name: MTEB STS15
type: mteb/sts15-sts
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_pearson
value: 89.395594115739
- type: cos_sim_spearman
value: 89.70312809978681
- type: euclidean_pearson
value: 89.10137224981938
- type: euclidean_spearman
value: 89.74149793061072
- type: manhattan_pearson
value: 89.06144914118401
- type: manhattan_spearman
value: 89.78489015365638
- task:
type: STS
dataset:
name: MTEB STS16
type: mteb/sts16-sts
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_pearson
value: 86.1720394205624
- type: cos_sim_spearman
value: 87.67900288178751
- type: euclidean_pearson
value: 86.73052291563968
- type: euclidean_spearman
value: 87.49116803671033
- type: manhattan_pearson
value: 86.79988999910331
- type: manhattan_spearman
value: 87.57540934207157
- task:
type: STS
dataset:
name: MTEB STS17 (en-en)
type: mteb/sts17-crosslingual-sts
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 88.75286004155564
- type: cos_sim_spearman
value: 88.03161515281518
- type: euclidean_pearson
value: 88.55464128719427
- type: euclidean_spearman
value: 87.78041200668837
- type: manhattan_pearson
value: 88.18469209314583
- type: manhattan_spearman
value: 87.31602253333598
- task:
type: STS
dataset:
name: MTEB STS22 (en)
type: mteb/sts22-crosslingual-sts
config: en
split: test
revision: eea2b4fe26a775864c896887d910b76a8098ad3f
metrics:
- type: cos_sim_pearson
value: 70.48372140035973
- type: cos_sim_spearman
value: 70.16107814793419
- type: euclidean_pearson
value: 69.65789511103976
- type: euclidean_spearman
value: 68.92441073988654
- type: manhattan_pearson
value: 69.55306498752127
- type: manhattan_spearman
value: 68.82186378798527
- task:
type: STS
dataset:
name: MTEB STSBenchmark
type: mteb/stsbenchmark-sts
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_pearson
value: 87.43017430741797
- type: cos_sim_spearman
value: 88.14675226940803
- type: euclidean_pearson
value: 87.33329490848514
- type: euclidean_spearman
value: 87.94164481397011
- type: manhattan_pearson
value: 87.19303598684772
- type: manhattan_spearman
value: 87.86899889639051
- task:
type: Reranking
dataset:
name: MTEB SciDocsRR
type: mteb/scidocs-reranking
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 87.03073019943413
- type: mrr
value: 96.67456726280255
- task:
type: Retrieval
dataset:
name: MTEB SciFact
type: mteb/scifact
config: default
split: test
revision: 0228b52cf27578f30900b9e5271d331663a030d7
metrics:
- type: map_at_1
value: 64.328
- type: map_at_10
value: 75.046
- type: map_at_100
value: 75.046
- type: map_at_1000
value: 75.046
- type: map_at_20
value: 75.046
- type: map_at_3
value: 72.42
- type: map_at_5
value: 73.88900000000001
- type: mrr_at_1
value: 67.667
- type: mrr_at_10
value: 76.19200000000001
- type: mrr_at_100
value: 76.19200000000001
- type: mrr_at_1000
value: 76.19200000000001
- type: mrr_at_20
value: 76.19200000000001
- type: mrr_at_3
value: 74.556
- type: mrr_at_5
value: 75.372
- type: ndcg_at_1
value: 67.667
- type: ndcg_at_10
value: 79.621
- type: ndcg_at_100
value: 79.621
- type: ndcg_at_1000
value: 79.621
- type: ndcg_at_20
value: 79.621
- type: ndcg_at_3
value: 75.506
- type: ndcg_at_5
value: 77.269
- type: precision_at_1
value: 67.667
- type: precision_at_10
value: 10.467
- type: precision_at_100
value: 1.047
- type: precision_at_1000
value: 0.105
- type: precision_at_20
value: 5.2330000000000005
- type: precision_at_3
value: 29.444
- type: precision_at_5
value: 19.133
- type: recall_at_1
value: 64.328
- type: recall_at_10
value: 92.389
- type: recall_at_100
value: 92.389
- type: recall_at_1000
value: 92.389
- type: recall_at_20
value: 92.389
- type: recall_at_3
value: 81.183
- type: recall_at_5
value: 85.60600000000001
- task:
type: PairClassification
dataset:
name: MTEB SprintDuplicateQuestions
type: mteb/sprintduplicatequestions-pairclassification
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.83762376237624
- type: cos_sim_ap
value: 96.51580702723564
- type: cos_sim_f1
value: 91.63265306122449
- type: cos_sim_precision
value: 93.54166666666667
- type: cos_sim_recall
value: 89.8
- type: dot_accuracy
value: 99.73663366336633
- type: dot_ap
value: 93.5764284433306
- type: dot_f1
value: 86.56565656565655
- type: dot_precision
value: 87.44897959183675
- type: dot_recall
value: 85.7
- type: euclidean_accuracy
value: 99.84059405940594
- type: euclidean_ap
value: 96.4738308210008
- type: euclidean_f1
value: 91.76470588235294
- type: euclidean_precision
value: 93.92670157068062
- type: euclidean_recall
value: 89.7
- type: manhattan_accuracy
value: 99.84356435643565
- type: manhattan_ap
value: 96.58366196890644
- type: manhattan_f1
value: 91.93054136874362
- type: manhattan_precision
value: 93.94572025052193
- type: manhattan_recall
value: 90.0
- type: max_accuracy
value: 99.84356435643565
- type: max_ap
value: 96.58366196890644
- type: max_f1
value: 91.93054136874362
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClustering
type: mteb/stackexchange-clustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 71.3538865724681
- type: v_measures
value:
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClusteringP2P
type: mteb/stackexchange-clustering-p2p
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 36.11009155563876
- type: v_measures
value:
- 0.35242637090556483
- 0.34198478937626525
- 0.3480143704468013
- 0.3432433824651389
- 0.34581837944580823
- 0.38852793624316134
- 0.3664105091244259
- 0.3798083138774721
- 0.37268279094517115
- 0.37209231273406684
- task:
type: Reranking
dataset:
name: MTEB StackOverflowDupQuestions
type: mteb/stackoverflowdupquestions-reranking
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 55.54551767207771
- type: mrr
value: 56.55926385705797
- task:
type: Summarization
dataset:
name: MTEB SummEval
type: mteb/summeval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 30.805678984951985
- type: cos_sim_spearman
value: 30.827574116605362
- type: dot_pearson
value: 29.899814768586204
- type: dot_spearman
value: 29.588760095881174
- task:
type: Retrieval
dataset:
name: MTEB TRECCOVID
type: mteb/trec-covid
config: default
split: test
revision: bb9466bac8153a0349341eb1b22e06409e78ef4e
metrics:
- type: map_at_1
value: 0.22200000000000003
- type: map_at_10
value: 2.046
- type: map_at_100
value: 2.046
- type: map_at_1000
value: 2.046
- type: map_at_20
value: 2.046
- type: map_at_3
value: 0.661
- type: map_at_5
value: 1.057
- type: mrr_at_1
value: 84.0
- type: mrr_at_10
value: 91.333
- type: mrr_at_100
value: 91.333
- type: mrr_at_1000
value: 91.333
- type: mrr_at_20
value: 91.333
- type: mrr_at_3
value: 91.0
- type: mrr_at_5
value: 91.0
- type: ndcg_at_1
value: 80.0
- type: ndcg_at_10
value: 80.74900000000001
- type: ndcg_at_100
value: 17.761
- type: ndcg_at_1000
value: 7.5920000000000005
- type: ndcg_at_20
value: 52.113
- type: ndcg_at_3
value: 83.542
- type: ndcg_at_5
value: 82.151
- type: precision_at_1
value: 84.0
- type: precision_at_10
value: 84.6
- type: precision_at_100
value: 8.459999999999999
- type: precision_at_1000
value: 0.8460000000000001
- type: precision_at_20
value: 42.3
- type: precision_at_3
value: 88.0
- type: precision_at_5
value: 86.0
- type: recall_at_1
value: 0.22200000000000003
- type: recall_at_10
value: 2.235
- type: recall_at_100
value: 2.235
- type: recall_at_1000
value: 2.235
- type: recall_at_20
value: 2.235
- type: recall_at_3
value: 0.695
- type: recall_at_5
value: 1.121
- task:
type: Retrieval
dataset:
name: MTEB Touche2020
type: mteb/touche2020
config: default
split: test
revision: a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f
metrics:
- type: map_at_1
value: 3.2750000000000004
- type: map_at_10
value: 10.514
- type: map_at_100
value: 10.514
- type: map_at_1000
value: 10.514
- type: map_at_20
value: 10.514
- type: map_at_3
value: 5.662
- type: map_at_5
value: 7.808
- type: mrr_at_1
value: 40.816
- type: mrr_at_10
value: 49.88
- type: mrr_at_100
value: 49.88
- type: mrr_at_1000
value: 49.88
- type: mrr_at_20
value: 49.88
- type: mrr_at_3
value: 46.259
- type: mrr_at_5
value: 47.585
- type: ndcg_at_1
value: 37.755
- type: ndcg_at_10
value: 25.237
- type: ndcg_at_100
value: 21.149
- type: ndcg_at_1000
value: 21.149
- type: ndcg_at_20
value: 21.401999999999997
- type: ndcg_at_3
value: 27.465
- type: ndcg_at_5
value: 26.169999999999998
- type: precision_at_1
value: 40.816
- type: precision_at_10
value: 21.224
- type: precision_at_100
value: 2.122
- type: precision_at_1000
value: 0.212
- type: precision_at_20
value: 10.612
- type: precision_at_3
value: 26.531
- type: precision_at_5
value: 24.490000000000002
- type: recall_at_1
value: 3.2750000000000004
- type: recall_at_10
value: 16.264
- type: recall_at_100
value: 16.264
- type: recall_at_1000
value: 16.264
- type: recall_at_20
value: 16.264
- type: recall_at_3
value: 6.265999999999999
- type: recall_at_5
value: 9.677
- task:
type: Classification
dataset:
name: MTEB ToxicConversationsClassification
type: mteb/toxic_conversations_50k
config: default
split: test
revision: edfaf9da55d3dd50d43143d90c1ac476895ae6de
metrics:
- type: accuracy
value: 66.181640625
- type: ap
value: 12.61343083198892
- type: f1
value: 51.12214559856414
- task:
type: Classification
dataset:
name: MTEB TweetSentimentExtractionClassification
type: mteb/tweet_sentiment_extraction
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 62.543859649122815
- type: f1
value: 62.742315191046295
- task:
type: Clustering
dataset:
name: MTEB TwentyNewsgroupsClustering
type: mteb/twentynewsgroups-clustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 54.7799424517948
- type: v_measures
value:
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- task:
type: PairClassification
dataset:
name: MTEB TwitterSemEval2015
type: mteb/twittersemeval2015-pairclassification
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 88.24581271979496
- type: cos_sim_ap
value: 81.34631603712425
- type: cos_sim_f1
value: 73.6588459099556
- type: cos_sim_precision
value: 70.91575091575092
- type: cos_sim_recall
value: 76.62269129287598
- type: dot_accuracy
value: 86.33247898909221
- type: dot_ap
value: 74.8713850965631
- type: dot_f1
value: 69.68152866242038
- type: dot_precision
value: 67.36453201970444
- type: dot_recall
value: 72.16358839050132
- type: euclidean_accuracy
value: 88.37098408535495
- type: euclidean_ap
value: 81.3880827682646
- type: euclidean_f1
value: 73.69367056104764
- type: euclidean_precision
value: 71.76794198549638
- type: euclidean_recall
value: 75.72559366754618
- type: manhattan_accuracy
value: 88.28157596709781
- type: manhattan_ap
value: 81.11568493905267
- type: manhattan_f1
value: 73.38364779874215
- type: manhattan_precision
value: 70.1201923076923
- type: manhattan_recall
value: 76.96569920844327
- type: max_accuracy
value: 88.37098408535495
- type: max_ap
value: 81.3880827682646
- type: max_f1
value: 73.69367056104764
- task:
type: PairClassification
dataset:
name: MTEB TwitterURLCorpus
type: mteb/twitterurlcorpus-pairclassification
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 89.54476656188147
- type: cos_sim_ap
value: 86.93964282285746
- type: cos_sim_f1
value: 79.50401702190103
- type: cos_sim_precision
value: 75.93020811435778
- type: cos_sim_recall
value: 83.43085925469664
- type: dot_accuracy
value: 88.64050917840649
- type: dot_ap
value: 84.81007248888473
- type: dot_f1
value: 77.95706670508572
- type: dot_precision
value: 73.24038982133189
- type: dot_recall
value: 83.32306744687403
- type: euclidean_accuracy
value: 89.53894516241705
- type: euclidean_ap
value: 86.92299719471643
- type: euclidean_f1
value: 79.55922060862585
- type: euclidean_precision
value: 75.61381606325426
- type: euclidean_recall
value: 83.93902063443178
- type: manhattan_accuracy
value: 89.5234214305119
- type: manhattan_ap
value: 86.93261273512803
- type: manhattan_f1
value: 79.54703705061019
- type: manhattan_precision
value: 75.90041261626688
- type: manhattan_recall
value: 83.56174930705266
- type: max_accuracy
value: 89.54476656188147
- type: max_ap
value: 86.93964282285746
- type: max_f1
value: 79.55922060862585
---
Details on how to run this model are available at https://github.com/raghavlite/TDTE.
"SUMMARIZATION"
]
| [
"BIOSSES",
"SCIFACT"
]
| BioNLP |
Details to run in https://github.com/raghavlite/TDTE | {"tags": ["mteb"], "model-index": [{"name": "0523_mistralv2_sum3echo512_bbcc_8_16_16", "results": [{"task": {"type": "Classification"}, "dataset": {"name": "MTEB AmazonCounterfactualClassification (en)", "type": "mteb/amazon_counterfactual", "config": "en", "split": "test", "revision": "e8379541af4e31359cca9fbcf4b00f2671dba205"}, "metrics": [{"type": "accuracy", "value": 79.65671641791045}, {"type": "ap", "value": 44.24063991266868}, {"type": "f1", "value": 73.91766997954294}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB AmazonPolarityClassification", "type": "mteb/amazon_polarity", "config": "default", "split": "test", "revision": "e2d317d38cd51312af73b3d32a06d1a08b442046"}, "metrics": [{"type": "accuracy", "value": 94.480125}, {"type": "ap", "value": 92.21829806116952}, {"type": "f1", "value": 94.47801150800291}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB AmazonReviewsClassification (en)", "type": "mteb/amazon_reviews_multi", "config": "en", "split": "test", "revision": "1399c76144fd37290681b995c656ef9b2e06e26d"}, "metrics": [{"type": "accuracy", "value": 48.157999999999994}, {"type": "f1", "value": 47.11858175135973}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB ArguAna", "type": "mteb/arguana", "config": "default", "split": "test", "revision": "c22ab2a51041ffd869aaddef7af8d8215647e41a"}, "metrics": [{"type": "map_at_1", "value": 31.935000000000002}, {"type": "map_at_10", "value": 49.482}, {"type": "map_at_100", "value": 49.482}, {"type": "map_at_1000", "value": 49.482}, {"type": "map_at_20", "value": 49.482}, {"type": "map_at_3", "value": 44.464}, {"type": "map_at_5", "value": 47.569}, {"type": "mrr_at_1", "value": 33.001000000000005}, {"type": "mrr_at_10", "value": 49.989}, {"type": "mrr_at_100", "value": 49.989}, {"type": "mrr_at_1000", "value": 49.989}, {"type": "mrr_at_20", "value": 49.989}, {"type": "mrr_at_3", "value": 44.903}, {"type": "mrr_at_5", "value": 48.054}, {"type": "ndcg_at_1", "value": 31.935000000000002}, {"type": "ndcg_at_10", "value": 58.819}, {"type": "ndcg_at_100", "value": 58.819}, {"type": "ndcg_at_1000", "value": 58.819}, {"type": "ndcg_at_20", "value": 58.819}, {"type": "ndcg_at_3", "value": 48.620000000000005}, {"type": "ndcg_at_5", "value": 54.230000000000004}, {"type": "precision_at_1", "value": 31.935000000000002}, {"type": "precision_at_10", "value": 8.841000000000001}, {"type": "precision_at_100", "value": 0.8840000000000001}, {"type": "precision_at_1000", "value": 0.08800000000000001}, {"type": "precision_at_20", "value": 4.42}, {"type": "precision_at_3", "value": 20.223}, {"type": "precision_at_5", "value": 14.865}, {"type": "recall_at_1", "value": 31.935000000000002}, {"type": "recall_at_10", "value": 88.407}, {"type": "recall_at_100", "value": 88.407}, {"type": "recall_at_1000", "value": 88.407}, {"type": "recall_at_20", "value": 88.407}, {"type": "recall_at_3", "value": 60.669}, {"type": "recall_at_5", "value": 74.324}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB ArxivClusteringP2P", "type": "mteb/arxiv-clustering-p2p", "config": "default", "split": "test", "revision": "a122ad7f3f0291bf49cc6f4d32aa80929df69d5d"}, "metrics": [{"type": "v_measure", "value": 48.7848435754835}, {"type": "v_measures", "value": [0.4921178407713082, 0.4900811693910433, 0.5035743243257481, 0.49769690824686913, 0.484482240428649, 0.48877156706650865, 0.4917783921004695, 0.490848646915023, 0.49292827306716547, 0.4667863103804292, 0.5663892295430093, 
0.5668130433770879, 0.5621288042146693, 0.5658463909906998, 0.5669889138453401, 0.5678202745454832, 0.5686559823111067, 0.5672351018082963, 0.554891045405333, 0.5661694307954689, 0.5309350425293812, 0.2938608518329288, 0.4844129096095996, 0.4282763304977941, 0.3635291849887843, 0.2962076070268785, 0.30324674572414795, 0.24299400753636727, 0.34506718232232675, 1.0, 0.28276775680196714, 0.4921178407713082, 0.4900811693910433, 0.5035743243257481, 0.49769690824686913, 0.484482240428649, 0.48877156706650865, 0.4917783921004695, 0.490848646915023, 0.49292827306716547, 0.4667863103804292, 0.5663892295430093, 0.5668130433770879, 0.5621288042146693, 0.5658463909906998, 0.5669889138453401, 0.5678202745454832, 0.5686559823111067, 0.5672351018082963, 0.554891045405333, 0.5661694307954689, 0.5309350425293812, 0.2938608518329288, 0.4844129096095996, 0.4282763304977941, 0.3635291849887843, 0.2962076070268785, 0.30324674572414795, 0.24299400753636727, 0.34506718232232675, 1.0, 0.28276775680196714, 0.4921178407713082, 0.4900811693910433, 0.5035743243257481, 0.49769690824686913, 0.484482240428649, 0.48877156706650865, 0.4917783921004695, 0.490848646915023, 0.49292827306716547, 0.4667863103804292, 0.5663892295430093, 0.5668130433770879, 0.5621288042146693, 0.5658463909906998, 0.5669889138453401, 0.5678202745454832, 0.5686559823111067, 0.5672351018082963, 0.554891045405333, 0.5661694307954689, 0.5309350425293812, 0.2938608518329288, 0.4844129096095996, 0.4282763304977941, 0.3635291849887843, 0.2962076070268785, 0.30324674572414795, 0.24299400753636727, 0.34506718232232675, 1.0, 0.28276775680196714, 0.4921178407713082, 0.4900811693910433, 0.5035743243257481, 0.49769690824686913, 0.484482240428649, 0.48877156706650865, 0.4917783921004695, 0.490848646915023, 0.49292827306716547, 0.4667863103804292, 0.5663892295430093, 0.5668130433770879, 0.5621288042146693, 0.5658463909906998, 0.5669889138453401, 0.5678202745454832, 0.5686559823111067, 0.5672351018082963, 0.554891045405333, 0.5661694307954689, 0.5309350425293812, 0.2938608518329288, 0.4844129096095996, 0.4282763304977941, 0.3635291849887843, 0.2962076070268785, 0.30324674572414795, 0.24299400753636727, 0.34506718232232675, 1.0, 0.28276775680196714, 0.4921178407713082, 0.4900811693910433, 0.5035743243257481, 0.49769690824686913, 0.484482240428649, 0.48877156706650865, 0.4917783921004695, 0.490848646915023, 0.49292827306716547, 0.4667863103804292, 0.5663892295430093, 0.5668130433770879, 0.5621288042146693, 0.5658463909906998, 0.5669889138453401, 0.5678202745454832, 0.5686559823111067, 0.5672351018082963, 0.554891045405333, 0.5661694307954689, 0.5309350425293812, 0.2938608518329288, 0.4844129096095996, 0.4282763304977941, 0.3635291849887843, 0.2962076070268785, 0.30324674572414795, 0.24299400753636727, 0.34506718232232675, 1.0, 0.28276775680196714, 0.4921178407713082, 0.4900811693910433, 0.5035743243257481, 0.49769690824686913, 0.484482240428649, 0.48877156706650865, 0.4917783921004695, 0.490848646915023, 0.49292827306716547, 0.4667863103804292, 0.5663892295430093, 0.5668130433770879, 0.5621288042146693, 0.5658463909906998, 0.5669889138453401, 0.5678202745454832, 0.5686559823111067, 0.5672351018082963, 0.554891045405333, 0.5661694307954689, 0.5309350425293812, 0.2938608518329288, 0.4844129096095996, 0.4282763304977941, 0.3635291849887843, 0.2962076070268785, 0.30324674572414795, 0.24299400753636727, 0.34506718232232675, 1.0, 0.28276775680196714, 0.4921178407713082, 0.4900811693910433, 0.5035743243257481, 0.49769690824686913, 0.484482240428649, 0.48877156706650865, 
0.4917783921004695, 0.490848646915023, 0.49292827306716547, 0.4667863103804292, 0.5663892295430093, 0.5668130433770879, 0.5621288042146693, 0.5658463909906998, 0.5669889138453401, 0.5678202745454832, 0.5686559823111067, 0.5672351018082963, 0.554891045405333, 0.5661694307954689, 0.5309350425293812, 0.2938608518329288, 0.4844129096095996, 0.4282763304977941, 0.3635291849887843, 0.2962076070268785, 0.30324674572414795, 0.24299400753636727, 0.34506718232232675, 1.0, 0.28276775680196714, 0.4921178407713082, 0.4900811693910433, 0.5035743243257481, 0.49769690824686913, 0.484482240428649, 0.48877156706650865, 0.4917783921004695, 0.490848646915023, 0.49292827306716547, 0.4667863103804292, 0.5663892295430093, 0.5668130433770879, 0.5621288042146693, 0.5658463909906998, 0.5669889138453401, 0.5678202745454832, 0.5686559823111067, 0.5672351018082963, 0.554891045405333, 0.5661694307954689, 0.5309350425293812, 0.2938608518329288, 0.4844129096095996, 0.4282763304977941, 0.3635291849887843, 0.2962076070268785, 0.30324674572414795, 0.24299400753636727, 0.34506718232232675, 1.0, 0.28276775680196714, 0.4921178407713082, 0.4900811693910433, 0.5035743243257481, 0.49769690824686913, 0.484482240428649, 0.48877156706650865, 0.4917783921004695, 0.490848646915023, 0.49292827306716547, 0.4667863103804292, 0.5663892295430093, 0.5668130433770879, 0.5621288042146693, 0.5658463909906998, 0.5669889138453401, 0.5678202745454832, 0.5686559823111067, 0.5672351018082963, 0.554891045405333, 0.5661694307954689, 0.5309350425293812, 0.2938608518329288, 0.4844129096095996, 0.4282763304977941, 0.3635291849887843, 0.2962076070268785, 0.30324674572414795, 0.24299400753636727, 0.34506718232232675, 1.0, 0.28276775680196714, 0.4921178407713082, 0.4900811693910433, 0.5035743243257481, 0.49769690824686913, 0.484482240428649, 0.48877156706650865, 0.4917783921004695, 0.490848646915023, 0.49292827306716547, 0.4667863103804292, 0.5663892295430093, 0.5668130433770879, 0.5621288042146693, 0.5658463909906998, 0.5669889138453401, 0.5678202745454832, 0.5686559823111067, 0.5672351018082963, 0.554891045405333, 0.5661694307954689, 0.5309350425293812, 0.2938608518329288, 0.4844129096095996, 0.4282763304977941, 0.3635291849887843, 0.2962076070268785, 0.30324674572414795, 0.24299400753636727, 0.34506718232232675, 1.0, 0.28276775680196714, 0.4921178407713082, 0.4900811693910433, 0.5035743243257481, 0.49769690824686913, 0.484482240428649, 0.48877156706650865, 0.4917783921004695, 0.490848646915023, 0.49292827306716547, 0.4667863103804292, 0.5663892295430093, 0.5668130433770879, 0.5621288042146693, 0.5658463909906998, 0.5669889138453401, 0.5678202745454832, 0.5686559823111067, 0.5672351018082963, 0.554891045405333, 0.5661694307954689, 0.5309350425293812, 0.2938608518329288, 0.4844129096095996, 0.4282763304977941, 0.3635291849887843, 0.2962076070268785, 0.30324674572414795, 0.24299400753636727, 0.34506718232232675, 1.0, 0.28276775680196714, 0.4921178407713082, 0.4900811693910433, 0.5035743243257481, 0.49769690824686913, 0.484482240428649, 0.48877156706650865, 0.4917783921004695, 0.490848646915023, 0.49292827306716547, 0.4667863103804292, 0.5663892295430093, 0.5668130433770879, 0.5621288042146693, 0.5658463909906998, 0.5669889138453401, 0.5678202745454832, 0.5686559823111067, 0.5672351018082963, 0.554891045405333, 0.5661694307954689, 0.5309350425293812, 0.2938608518329288, 0.4844129096095996, 0.4282763304977941, 0.3635291849887843, 0.2962076070268785, 0.30324674572414795, 0.24299400753636727, 0.34506718232232675, 1.0, 0.28276775680196714, 0.4921178407713082, 
0.4900811693910433, 0.5035743243257481, 0.49769690824686913, 0.484482240428649, 0.48877156706650865, 0.4917783921004695, 0.490848646915023, 0.49292827306716547, 0.4667863103804292, 0.5663892295430093, 0.5668130433770879, 0.5621288042146693, 0.5658463909906998, 0.5669889138453401, 0.5678202745454832, 0.5686559823111067, 0.5672351018082963, 0.554891045405333, 0.5661694307954689, 0.5309350425293812, 0.2938608518329288, 0.4844129096095996, 0.4282763304977941, 0.3635291849887843, 0.2962076070268785, 0.30324674572414795, 0.24299400753636727, 0.34506718232232675, 1.0, 0.28276775680196714, 0.4921178407713082, 0.4900811693910433, 0.5035743243257481, 0.49769690824686913, 0.484482240428649, 0.48877156706650865, 0.4917783921004695, 0.490848646915023, 0.49292827306716547, 0.4667863103804292, 0.5663892295430093, 0.5668130433770879, 0.5621288042146693, 0.5658463909906998, 0.5669889138453401, 0.5678202745454832, 0.5686559823111067, 0.5672351018082963, 0.554891045405333, 0.5661694307954689, 0.5309350425293812, 0.2938608518329288, 0.4844129096095996, 0.4282763304977941, 0.3635291849887843, 0.2962076070268785, 0.30324674572414795, 0.24299400753636727, 0.34506718232232675, 1.0, 0.28276775680196714, 0.4921178407713082, 0.4900811693910433, 0.5035743243257481, 0.49769690824686913, 0.484482240428649, 0.48877156706650865, 0.4917783921004695, 0.490848646915023, 0.49292827306716547, 0.4667863103804292, 0.5663892295430093, 0.5668130433770879, 0.5621288042146693, 0.5658463909906998, 0.5669889138453401, 0.5678202745454832, 0.5686559823111067, 0.5672351018082963, 0.554891045405333, 0.5661694307954689, 0.5309350425293812, 0.2938608518329288, 0.4844129096095996, 0.4282763304977941, 0.3635291849887843, 0.2962076070268785, 0.30324674572414795, 0.24299400753636727, 0.34506718232232675, 1.0, 0.28276775680196714, 0.4921178407713082, 0.4900811693910433, 0.5035743243257481, 0.49769690824686913, 0.484482240428649, 0.48877156706650865, 0.4917783921004695, 0.490848646915023, 0.49292827306716547, 0.4667863103804292, 0.5663892295430093, 0.5668130433770879, 0.5621288042146693, 0.5658463909906998, 0.5669889138453401, 0.5678202745454832, 0.5686559823111067, 0.5672351018082963, 0.554891045405333, 0.5661694307954689, 0.5309350425293812, 0.2938608518329288, 0.4844129096095996, 0.4282763304977941, 0.3635291849887843, 0.2962076070268785, 0.30324674572414795, 0.24299400753636727, 0.34506718232232675, 1.0, 0.28276775680196714, 0.4921178407713082, 0.4900811693910433, 0.5035743243257481, 0.49769690824686913, 0.484482240428649, 0.48877156706650865, 0.4917783921004695, 0.490848646915023, 0.49292827306716547, 0.4667863103804292, 0.5663892295430093, 0.5668130433770879, 0.5621288042146693, 0.5658463909906998, 0.5669889138453401, 0.5678202745454832, 0.5686559823111067, 0.5672351018082963, 0.554891045405333, 0.5661694307954689, 0.5309350425293812, 0.2938608518329288, 0.4844129096095996, 0.4282763304977941, 0.3635291849887843, 0.2962076070268785, 0.30324674572414795, 0.24299400753636727, 0.34506718232232675, 1.0, 0.28276775680196714, 0.4921178407713082, 0.4900811693910433, 0.5035743243257481, 0.49769690824686913, 0.484482240428649, 0.48877156706650865, 0.4917783921004695, 0.490848646915023, 0.49292827306716547, 0.4667863103804292, 0.5663892295430093, 0.5668130433770879, 0.5621288042146693, 0.5658463909906998, 0.5669889138453401, 0.5678202745454832, 0.5686559823111067, 0.5672351018082963, 0.554891045405333, 0.5661694307954689, 0.5309350425293812, 0.2938608518329288, 0.4844129096095996, 0.4282763304977941, 0.3635291849887843, 0.2962076070268785, 
0.30324674572414795, 0.24299400753636727, 0.34506718232232675, 1.0, 0.28276775680196714, 0.4921178407713082, 0.4900811693910433, 0.5035743243257481, 0.49769690824686913, 0.484482240428649, 0.48877156706650865, 0.4917783921004695, 0.490848646915023, 0.49292827306716547, 0.4667863103804292, 0.5663892295430093, 0.5668130433770879, 0.5621288042146693, 0.5658463909906998, 0.5669889138453401, 0.5678202745454832, 0.5686559823111067, 0.5672351018082963, 0.554891045405333, 0.5661694307954689, 0.5309350425293812, 0.2938608518329288, 0.4844129096095996, 0.4282763304977941, 0.3635291849887843, 0.2962076070268785, 0.30324674572414795, 0.24299400753636727, 0.34506718232232675, 1.0, 0.28276775680196714, 0.4921178407713082, 0.4900811693910433, 0.5035743243257481, 0.49769690824686913, 0.484482240428649, 0.48877156706650865, 0.4917783921004695, 0.490848646915023, 0.49292827306716547, 0.4667863103804292, 0.5663892295430093, 0.5668130433770879, 0.5621288042146693, 0.5658463909906998, 0.5669889138453401, 0.5678202745454832, 0.5686559823111067, 0.5672351018082963, 0.554891045405333, 0.5661694307954689, 0.5309350425293812, 0.2938608518329288, 0.4844129096095996, 0.4282763304977941, 0.3635291849887843, 0.2962076070268785, 0.30324674572414795, 0.24299400753636727, 0.34506718232232675, 1.0, 0.28276775680196714, 0.4921178407713082, 0.4900811693910433, 0.5035743243257481, 0.49769690824686913, 0.484482240428649, 0.48877156706650865, 0.4917783921004695, 0.490848646915023, 0.49292827306716547, 0.4667863103804292, 0.5663892295430093, 0.5668130433770879, 0.5621288042146693, 0.5658463909906998, 0.5669889138453401, 0.5678202745454832, 0.5686559823111067, 0.5672351018082963, 0.554891045405333, 0.5661694307954689, 0.5309350425293812, 0.2938608518329288, 0.4844129096095996, 0.4282763304977941, 0.3635291849887843, 0.2962076070268785, 0.30324674572414795, 0.24299400753636727, 0.34506718232232675, 1.0, 0.28276775680196714, 0.4921178407713082, 0.4900811693910433, 0.5035743243257481, 0.49769690824686913, 0.484482240428649, 0.48877156706650865, 0.4917783921004695, 0.490848646915023, 0.49292827306716547, 0.4667863103804292, 0.5663892295430093, 0.5668130433770879, 0.5621288042146693, 0.5658463909906998, 0.5669889138453401, 0.5678202745454832, 0.5686559823111067, 0.5672351018082963, 0.554891045405333, 0.5661694307954689, 0.5309350425293812, 0.2938608518329288, 0.4844129096095996, 0.4282763304977941, 0.3635291849887843, 0.2962076070268785, 0.30324674572414795, 0.24299400753636727, 0.34506718232232675, 1.0, 0.28276775680196714, 0.4921178407713082, 0.4900811693910433, 0.5035743243257481, 0.49769690824686913, 0.484482240428649, 0.48877156706650865, 0.4917783921004695, 0.490848646915023, 0.49292827306716547, 0.4667863103804292, 0.5663892295430093, 0.5668130433770879, 0.5621288042146693, 0.5658463909906998, 0.5669889138453401, 0.5678202745454832, 0.5686559823111067, 0.5672351018082963, 0.554891045405333, 0.5661694307954689, 0.5309350425293812, 0.2938608518329288, 0.4844129096095996, 0.4282763304977941, 0.3635291849887843, 0.2962076070268785, 0.30324674572414795, 0.24299400753636727, 0.34506718232232675, 1.0, 0.28276775680196714, 0.4921178407713082, 0.4900811693910433, 0.5035743243257481, 0.49769690824686913, 0.484482240428649, 0.48877156706650865, 0.4917783921004695, 0.490848646915023, 0.49292827306716547, 0.4667863103804292, 0.5663892295430093, 0.5668130433770879, 0.5621288042146693, 0.5658463909906998, 0.5669889138453401, 0.5678202745454832, 0.5686559823111067, 0.5672351018082963, 0.554891045405333, 0.5661694307954689, 0.5309350425293812, 
0.2938608518329288, 0.4844129096095996, 0.4282763304977941, 0.3635291849887843, 0.2962076070268785, 0.30324674572414795, 0.24299400753636727, 0.34506718232232675, 1.0, 0.28276775680196714, 0.4921178407713082, 0.4900811693910433, 0.5035743243257481, 0.49769690824686913, 0.484482240428649, 0.48877156706650865, 0.4917783921004695, 0.490848646915023, 0.49292827306716547, 0.4667863103804292, 0.5663892295430093, 0.5668130433770879, 0.5621288042146693, 0.5658463909906998, 0.5669889138453401, 0.5678202745454832, 0.5686559823111067, 0.5672351018082963, 0.554891045405333, 0.5661694307954689, 0.5309350425293812, 0.2938608518329288, 0.4844129096095996, 0.4282763304977941, 0.3635291849887843, 0.2962076070268785, 0.30324674572414795, 0.24299400753636727, 0.34506718232232675, 1.0, 0.28276775680196714, 0.4921178407713082, 0.4900811693910433, 0.5035743243257481, 0.49769690824686913, 0.484482240428649, 0.48877156706650865, 0.4917783921004695, 0.490848646915023, 0.49292827306716547, 0.4667863103804292, 0.5663892295430093, 0.5668130433770879, 0.5621288042146693, 0.5658463909906998, 0.5669889138453401, 0.5678202745454832, 0.5686559823111067, 0.5672351018082963, 0.554891045405333, 0.5661694307954689, 0.5309350425293812, 0.2938608518329288, 0.4844129096095996, 0.4282763304977941, 0.3635291849887843, 0.2962076070268785, 0.30324674572414795, 0.24299400753636727, 0.34506718232232675, 1.0, 0.28276775680196714, 0.4921178407713082, 0.4900811693910433, 0.5035743243257481, 0.49769690824686913, 0.484482240428649, 0.48877156706650865, 0.4917783921004695, 0.490848646915023, 0.49292827306716547, 0.4667863103804292, 0.5663892295430093, 0.5668130433770879, 0.5621288042146693, 0.5658463909906998, 0.5669889138453401, 0.5678202745454832, 0.5686559823111067, 0.5672351018082963, 0.554891045405333, 0.5661694307954689, 0.5309350425293812, 0.2938608518329288, 0.4844129096095996, 0.4282763304977941, 0.3635291849887843, 0.2962076070268785, 0.30324674572414795, 0.24299400753636727, 0.34506718232232675, 1.0, 0.28276775680196714, 0.4921178407713082, 0.4900811693910433, 0.5035743243257481, 0.49769690824686913, 0.484482240428649, 0.48877156706650865, 0.4917783921004695, 0.490848646915023, 0.49292827306716547, 0.4667863103804292, 0.5663892295430093, 0.5668130433770879, 0.5621288042146693, 0.5658463909906998, 0.5669889138453401, 0.5678202745454832, 0.5686559823111067, 0.5672351018082963, 0.554891045405333, 0.5661694307954689, 0.5309350425293812, 0.2938608518329288, 0.4844129096095996, 0.4282763304977941, 0.3635291849887843, 0.2962076070268785, 0.30324674572414795, 0.24299400753636727, 0.34506718232232675, 1.0, 0.28276775680196714, 0.4921178407713082, 0.4900811693910433, 0.5035743243257481, 0.49769690824686913, 0.484482240428649, 0.48877156706650865, 0.4917783921004695, 0.490848646915023, 0.49292827306716547, 0.4667863103804292, 0.5663892295430093, 0.5668130433770879, 0.5621288042146693, 0.5658463909906998, 0.5669889138453401, 0.5678202745454832, 0.5686559823111067, 0.5672351018082963, 0.554891045405333, 0.5661694307954689, 0.5309350425293812, 0.2938608518329288, 0.4844129096095996, 0.4282763304977941, 0.3635291849887843, 0.2962076070268785, 0.30324674572414795, 0.24299400753636727, 0.34506718232232675, 1.0, 0.28276775680196714, 0.4921178407713082, 0.4900811693910433, 0.5035743243257481, 0.49769690824686913, 0.484482240428649, 0.48877156706650865, 0.4917783921004695, 0.490848646915023, 0.49292827306716547, 0.4667863103804292, 0.5663892295430093, 0.5668130433770879, 0.5621288042146693, 0.5658463909906998, 0.5669889138453401, 0.5678202745454832, 
0.5686559823111067, 0.5672351018082963, 0.554891045405333, 0.5661694307954689, 0.5309350425293812, 0.2938608518329288, 0.4844129096095996, 0.4282763304977941, 0.3635291849887843, 0.2962076070268785, 0.30324674572414795, 0.24299400753636727, 0.34506718232232675, 1.0, 0.28276775680196714, 0.4921178407713082, 0.4900811693910433, 0.5035743243257481, 0.49769690824686913, 0.484482240428649, 0.48877156706650865, 0.4917783921004695, 0.490848646915023, 0.49292827306716547, 0.4667863103804292, 0.5663892295430093, 0.5668130433770879, 0.5621288042146693, 0.5658463909906998, 0.5669889138453401, 0.5678202745454832, 0.5686559823111067, 0.5672351018082963, 0.554891045405333, 0.5661694307954689, 0.5309350425293812, 0.2938608518329288, 0.4844129096095996, 0.4282763304977941, 0.3635291849887843, 0.2962076070268785, 0.30324674572414795, 0.24299400753636727, 0.34506718232232675, 1.0, 0.28276775680196714, 0.4921178407713082, 0.4900811693910433, 0.5035743243257481, 0.49769690824686913, 0.484482240428649, 0.48877156706650865, 0.4917783921004695, 0.490848646915023, 0.49292827306716547, 0.4667863103804292, 0.5663892295430093, 0.5668130433770879, 0.5621288042146693, 0.5658463909906998, 0.5669889138453401, 0.5678202745454832, 0.5686559823111067, 0.5672351018082963, 0.554891045405333, 0.5661694307954689, 0.5309350425293812, 0.2938608518329288, 0.4844129096095996, 0.4282763304977941, 0.3635291849887843, 0.2962076070268785, 0.30324674572414795, 0.24299400753636727, 0.34506718232232675, 1.0, 0.28276775680196714, 0.4921178407713082, 0.4900811693910433, 0.5035743243257481, 0.49769690824686913, 0.484482240428649, 0.48877156706650865, 0.4917783921004695, 0.490848646915023, 0.49292827306716547, 0.4667863103804292, 0.5663892295430093, 0.5668130433770879, 0.5621288042146693, 0.5658463909906998, 0.5669889138453401, 0.5678202745454832, 0.5686559823111067, 0.5672351018082963, 0.554891045405333, 0.5661694307954689, 0.5309350425293812, 0.2938608518329288, 0.4844129096095996, 0.4282763304977941, 0.3635291849887843, 0.2962076070268785, 0.30324674572414795, 0.24299400753636727, 0.34506718232232675, 1.0, 0.28276775680196714, 0.4921178407713082, 0.4900811693910433, 0.5035743243257481, 0.49769690824686913, 0.484482240428649, 0.48877156706650865, 0.4917783921004695, 0.490848646915023, 0.49292827306716547, 0.4667863103804292, 0.5663892295430093, 0.5668130433770879, 0.5621288042146693, 0.5658463909906998, 0.5669889138453401, 0.5678202745454832, 0.5686559823111067, 0.5672351018082963, 0.554891045405333, 0.5661694307954689, 0.5309350425293812, 0.2938608518329288, 0.4844129096095996, 0.4282763304977941, 0.3635291849887843, 0.2962076070268785, 0.30324674572414795, 0.24299400753636727, 0.34506718232232675, 1.0, 0.28276775680196714, 0.4921178407713082, 0.4900811693910433, 0.5035743243257481, 0.49769690824686913, 0.484482240428649, 0.48877156706650865, 0.4917783921004695, 0.490848646915023, 0.49292827306716547, 0.4667863103804292, 0.5663892295430093, 0.5668130433770879, 0.5621288042146693, 0.5658463909906998, 0.5669889138453401, 0.5678202745454832, 0.5686559823111067, 0.5672351018082963, 0.554891045405333, 0.5661694307954689, 0.5309350425293812, 0.2938608518329288, 0.4844129096095996, 0.4282763304977941, 0.3635291849887843, 0.2962076070268785, 0.30324674572414795, 0.24299400753636727, 0.34506718232232675, 1.0, 0.28276775680196714, 0.4921178407713082, 0.4900811693910433, 0.5035743243257481, 0.49769690824686913, 0.484482240428649, 0.48877156706650865, 0.4917783921004695, 0.490848646915023, 0.49292827306716547, 0.4667863103804292, 0.5663892295430093, 
0.5668130433770879, 0.5621288042146693, 0.5658463909906998, 0.5669889138453401, 0.5678202745454832, 0.5686559823111067, 0.5672351018082963, 0.554891045405333, 0.5661694307954689, 0.5309350425293812, 0.2938608518329288, 0.4844129096095996, 0.4282763304977941, 0.3635291849887843, 0.2962076070268785, 0.30324674572414795, 0.24299400753636727, 0.34506718232232675, 1.0, 0.28276775680196714]}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB ArxivClusteringS2S", "type": "mteb/arxiv-clustering-s2s", "config": "default", "split": "test", "revision": "f910caf1a6075f7329cdf8c1a6135696f37dbd53"}, "metrics": [{"type": "v_measure", "value": 46.10665257880071}, {"type": "v_measures", "value": [0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887,
0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 
0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 
0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 
0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 
0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 
0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 
0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 
0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 
0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363, 0.4791303592299426, 0.47312049608032497, 0.4855223164775998, 0.4571429771751102, 0.4762861002816672, 0.48218700555188587, 0.4774159340612887, 0.4706669107168955, 0.4817074105941521, 0.46122831822845595, 0.5323998509009684, 0.5366144743504581, 0.5350659892124341, 0.5348097376189661, 0.5361859305887842, 0.5401424740226736, 0.5386301513493418, 0.536195294071538, 0.5307019767098927, 0.529430500641798, 0.48993023034390504, 0.24840671183096288, 0.41882293476660615, 0.3892318610333167, 0.325751283253651, 0.24324245195504823, 0.2853604795144245, 0.23061705991870918, 0.31166614557038164, 1.0, 0.2554489333770363]}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB AskUbuntuDupQuestions", "type": "mteb/askubuntudupquestions-reranking", "config": "default", "split": "test", "revision": "2000358ca161889fa9c082cb41daa8dcfb161a54"}, "metrics": [{"type": "map", "value": 66.7285956124022}, {"type": "mrr", "value": 79.72233214615486}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB BIOSSES", "type": "mteb/biosses-sts", "config": "default", "split": "test", "revision": "d3fb88f8f02e40887cd149695127462bbcf29b4a"}, "metrics": [{"type": "cos_sim_pearson", "value": 88.73245869702066}, {"type": "cos_sim_spearman", "value": 87.28451895745819}, {"type": "euclidean_pearson", "value": 86.44569617089661}, {"type": "euclidean_spearman", "value": 86.7236628044763}, {"type": "manhattan_pearson", "value": 86.50853979799092}, {"type": "manhattan_spearman", "value": 86.75920578302187}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB Banking77Classification", "type": "mteb/banking77", "config": "default", "split": "test", "revision": "0fd18e25b25c072e09e0d92ab615fda904d66300"}, "metrics": [{"type": "accuracy", "value": 88.91233766233766}, {"type": "f1", "value": 88.86315189747688}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB BiorxivClusteringP2P", "type": "mteb/biorxiv-clustering-p2p", "config": "default", "split": "test", "revision": "65b79d1d13f80053f67aca9498d9402c2d9f1f40"}, "metrics": [{"type": "v_measure", "value": 38.7850808112868}, {"type": "v_measures", "value": [0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 
0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 
0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 
0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 
0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904, 
0.4081723486035933, 0.387862617887449, 0.38352827892371627, 0.371265066952095, 0.3774981384705982, 0.37131831220293676, 0.39149988570912153, 0.38703497665413544, 0.40930675826264357, 0.3910216974623904]}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB BiorxivClusteringS2S", "type": "mteb/biorxiv-clustering-s2s", "config": "default", "split": "test", "revision": "258694dd0231531bc1fd9de6ceb52a0853c6d908"}, "metrics": [{"type": "v_measure", "value": 37.37318034700008}, {"type": "v_measures", "value": [0.36845423004088185, 0.38992061254062366, 0.3717948730004672, 0.36026627188254456, 0.3669860108798917, 0.36731355824516293, 0.375291529012098, 0.38550090432534534, 0.36577228218454805, 0.38601776258844467]}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackAndroidRetrieval", "type": "mteb/cqadupstack-android", "config": "default", "split": "test", "revision": "f46a197baaae43b4f621051089b82a364682dfeb"}, "metrics": [{"type": "map_at_1", "value": 39.232}, {"type": "map_at_10", "value": 53.04299999999999}, {"type": "map_at_100", "value": 53.04299999999999}, {"type": "map_at_1000", "value": 53.04299999999999}, {"type": "map_at_20", "value": 53.04299999999999}, {"type": "map_at_3", "value": 48.588}, {"type": "map_at_5", "value": 51.17699999999999}, {"type": "mrr_at_1", "value": 49.356}, {"type": "mrr_at_10", "value": 59.550000000000004}, {"type": "mrr_at_100", "value": 59.550000000000004}, {"type": "mrr_at_1000", "value": 59.550000000000004}, {"type": "mrr_at_20", "value": 59.550000000000004}, {"type": "mrr_at_3", "value": 56.986000000000004}, {"type": "mrr_at_5", "value": 58.638999999999996}, {"type": "ndcg_at_1", "value": 49.356}, {"type": "ndcg_at_10", "value": 60.156}, {"type": "ndcg_at_100", "value": 59.714999999999996}, {"type": "ndcg_at_1000", "value": 59.699000000000005}, {"type": "ndcg_at_20", "value": 59.831}, {"type": "ndcg_at_3", "value": 54.75299999999999}, {"type": "ndcg_at_5", "value": 57.443999999999996}, {"type": "precision_at_1", "value": 49.356}, {"type": "precision_at_10", "value": 11.86}, {"type": "precision_at_100", "value": 1.1860000000000002}, {"type": "precision_at_1000", "value": 0.11900000000000001}, {"type": "precision_at_20", "value": 5.93}, {"type": "precision_at_3", "value": 26.895999999999997}, {"type": "precision_at_5", "value": 19.570999999999998}, {"type": "recall_at_1", "value": 39.232}, {"type": "recall_at_10", "value": 72.98400000000001}, {"type": "recall_at_100", "value": 72.98400000000001}, {"type": "recall_at_1000", "value": 72.98400000000001}, {"type": "recall_at_20", "value": 72.98400000000001}, {"type": "recall_at_3", "value": 56.213}, {"type": 
"recall_at_5", "value": 64.318}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackEnglishRetrieval", "type": "mteb/cqadupstack-english", "config": "default", "split": "test", "revision": "ad9991cb51e31e31e430383c75ffb2885547b5f0"}, "metrics": [{"type": "map_at_1", "value": 37.157000000000004}, {"type": "map_at_10", "value": 49.512}, {"type": "map_at_100", "value": 49.512}, {"type": "map_at_1000", "value": 49.512}, {"type": "map_at_20", "value": 49.512}, {"type": "map_at_3", "value": 46.099000000000004}, {"type": "map_at_5", "value": 48.061}, {"type": "mrr_at_1", "value": 47.516000000000005}, {"type": "mrr_at_10", "value": 55.803999999999995}, {"type": "mrr_at_100", "value": 55.803999999999995}, {"type": "mrr_at_1000", "value": 55.803999999999995}, {"type": "mrr_at_20", "value": 55.803999999999995}, {"type": "mrr_at_3", "value": 53.885000000000005}, {"type": "mrr_at_5", "value": 54.967999999999996}, {"type": "ndcg_at_1", "value": 47.516000000000005}, {"type": "ndcg_at_10", "value": 55.386}, {"type": "ndcg_at_100", "value": 54.952}, {"type": "ndcg_at_1000", "value": 54.952}, {"type": "ndcg_at_20", "value": 55.07300000000001}, {"type": "ndcg_at_3", "value": 51.458000000000006}, {"type": "ndcg_at_5", "value": 53.189}, {"type": "precision_at_1", "value": 47.516000000000005}, {"type": "precision_at_10", "value": 10.567}, {"type": "precision_at_100", "value": 1.057}, {"type": "precision_at_1000", "value": 0.106}, {"type": "precision_at_20", "value": 5.283}, {"type": "precision_at_3", "value": 25.393}, {"type": "precision_at_5", "value": 17.656}, {"type": "recall_at_1", "value": 37.157000000000004}, {"type": "recall_at_10", "value": 65.026}, {"type": "recall_at_100", "value": 65.026}, {"type": "recall_at_1000", "value": 65.026}, {"type": "recall_at_20", "value": 65.026}, {"type": "recall_at_3", "value": 52.36300000000001}, {"type": "recall_at_5", "value": 57.989999999999995}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackGamingRetrieval", "type": "mteb/cqadupstack-gaming", "config": "default", "split": "test", "revision": "4885aa143210c98657558c04aaf3dc47cfb54340"}, "metrics": [{"type": "map_at_1", "value": 48.522999999999996}, {"type": "map_at_10", "value": 62.844}, {"type": "map_at_100", "value": 62.844}, {"type": "map_at_1000", "value": 62.844}, {"type": "map_at_20", "value": 62.844}, {"type": "map_at_3", "value": 59.150999999999996}, {"type": "map_at_5", "value": 61.403}, {"type": "mrr_at_1", "value": 55.925000000000004}, {"type": "mrr_at_10", "value": 66.113}, {"type": "mrr_at_100", "value": 66.113}, {"type": "mrr_at_1000", "value": 66.113}, {"type": "mrr_at_20", "value": 66.113}, {"type": "mrr_at_3", "value": 63.783}, {"type": "mrr_at_5", "value": 65.212}, {"type": "ndcg_at_1", "value": 55.925000000000004}, {"type": "ndcg_at_10", "value": 68.869}, {"type": "ndcg_at_100", "value": 68.774}, {"type": "ndcg_at_1000", "value": 68.774}, {"type": "ndcg_at_20", "value": 68.777}, {"type": "ndcg_at_3", "value": 63.31400000000001}, {"type": "ndcg_at_5", "value": 66.247}, {"type": "precision_at_1", "value": 55.925000000000004}, {"type": "precision_at_10", "value": 10.997}, {"type": "precision_at_100", "value": 1.0999999999999999}, {"type": "precision_at_1000", "value": 0.11}, {"type": "precision_at_20", "value": 5.498}, {"type": "precision_at_3", "value": 28.359}, {"type": "precision_at_5", "value": 19.386}, {"type": "recall_at_1", "value": 48.522999999999996}, {"type": "recall_at_10", "value": 83.045}, {"type": "recall_at_100", "value": 83.045}, {"type": 
"recall_at_1000", "value": 83.045}, {"type": "recall_at_20", "value": 83.045}, {"type": "recall_at_3", "value": 68.449}, {"type": "recall_at_5", "value": 75.62100000000001}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackGisRetrieval", "type": "mteb/cqadupstack-gis", "config": "default", "split": "test", "revision": "5003b3064772da1887988e05400cf3806fe491f2"}, "metrics": [{"type": "map_at_1", "value": 30.726}, {"type": "map_at_10", "value": 40.433}, {"type": "map_at_100", "value": 40.433}, {"type": "map_at_1000", "value": 40.433}, {"type": "map_at_20", "value": 40.433}, {"type": "map_at_3", "value": 37.135}, {"type": "map_at_5", "value": 39.17}, {"type": "mrr_at_1", "value": 33.672000000000004}, {"type": "mrr_at_10", "value": 42.836}, {"type": "mrr_at_100", "value": 42.836}, {"type": "mrr_at_1000", "value": 42.836}, {"type": "mrr_at_20", "value": 42.836}, {"type": "mrr_at_3", "value": 39.755}, {"type": "mrr_at_5", "value": 41.631}, {"type": "ndcg_at_1", "value": 33.672000000000004}, {"type": "ndcg_at_10", "value": 46.092}, {"type": "ndcg_at_100", "value": 46.092}, {"type": "ndcg_at_1000", "value": 46.092}, {"type": "ndcg_at_20", "value": 46.092}, {"type": "ndcg_at_3", "value": 39.797}, {"type": "ndcg_at_5", "value": 43.171}, {"type": "precision_at_1", "value": 33.672000000000004}, {"type": "precision_at_10", "value": 7.073}, {"type": "precision_at_100", "value": 0.707}, {"type": "precision_at_1000", "value": 0.07100000000000001}, {"type": "precision_at_20", "value": 3.537}, {"type": "precision_at_3", "value": 16.648}, {"type": "precision_at_5", "value": 11.91}, {"type": "recall_at_1", "value": 30.726}, {"type": "recall_at_10", "value": 61.24000000000001}, {"type": "recall_at_100", "value": 61.24000000000001}, {"type": "recall_at_1000", "value": 61.24000000000001}, {"type": "recall_at_20", "value": 61.24000000000001}, {"type": "recall_at_3", "value": 44.557}, {"type": "recall_at_5", "value": 52.608999999999995}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackMathematicaRetrieval", "type": "mteb/cqadupstack-mathematica", "config": "default", "split": "test", "revision": "90fceea13679c63fe563ded68f3b6f06e50061de"}, "metrics": [{"type": "map_at_1", "value": 21.554000000000002}, {"type": "map_at_10", "value": 31.508000000000003}, {"type": "map_at_100", "value": 31.508000000000003}, {"type": "map_at_1000", "value": 31.508000000000003}, {"type": "map_at_20", "value": 31.508000000000003}, {"type": "map_at_3", "value": 28.225}, {"type": "map_at_5", "value": 30.043}, {"type": "mrr_at_1", "value": 27.114}, {"type": "mrr_at_10", "value": 36.631}, {"type": "mrr_at_100", "value": 36.631}, {"type": "mrr_at_1000", "value": 36.631}, {"type": "mrr_at_20", "value": 36.631}, {"type": "mrr_at_3", "value": 34.059}, {"type": "mrr_at_5", "value": 35.601}, {"type": "ndcg_at_1", "value": 27.114}, {"type": "ndcg_at_10", "value": 37.592999999999996}, {"type": "ndcg_at_100", "value": 37.588}, {"type": "ndcg_at_1000", "value": 37.588}, {"type": "ndcg_at_20", "value": 37.588}, {"type": "ndcg_at_3", "value": 32.038}, {"type": "ndcg_at_5", "value": 34.689}, {"type": "precision_at_1", "value": 27.114}, {"type": "precision_at_10", "value": 7.090000000000001}, {"type": "precision_at_100", "value": 0.709}, {"type": "precision_at_1000", "value": 0.07100000000000001}, {"type": "precision_at_20", "value": 3.5450000000000004}, {"type": "precision_at_3", "value": 15.506}, {"type": "precision_at_5", "value": 11.393}, {"type": "recall_at_1", "value": 21.554000000000002}, {"type": 
"recall_at_10", "value": 50.879}, {"type": "recall_at_100", "value": 50.879}, {"type": "recall_at_1000", "value": 50.879}, {"type": "recall_at_20", "value": 50.879}, {"type": "recall_at_3", "value": 35.827999999999996}, {"type": "recall_at_5", "value": 42.476}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackPhysicsRetrieval", "type": "mteb/cqadupstack-physics", "config": "default", "split": "test", "revision": "79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4"}, "metrics": [{"type": "map_at_1", "value": 35.36}, {"type": "map_at_10", "value": 48.483}, {"type": "map_at_100", "value": 48.483}, {"type": "map_at_1000", "value": 48.483}, {"type": "map_at_20", "value": 48.483}, {"type": "map_at_3", "value": 44.639}, {"type": "map_at_5", "value": 46.698}, {"type": "mrr_at_1", "value": 43.985}, {"type": "mrr_at_10", "value": 54.039}, {"type": "mrr_at_100", "value": 54.039}, {"type": "mrr_at_1000", "value": 54.039}, {"type": "mrr_at_20", "value": 54.039}, {"type": "mrr_at_3", "value": 51.54}, {"type": "mrr_at_5", "value": 52.859}, {"type": "ndcg_at_1", "value": 43.985}, {"type": "ndcg_at_10", "value": 55.069}, {"type": "ndcg_at_100", "value": 54.967}, {"type": "ndcg_at_1000", "value": 54.967}, {"type": "ndcg_at_20", "value": 54.996}, {"type": "ndcg_at_3", "value": 49.544}, {"type": "ndcg_at_5", "value": 51.932}, {"type": "precision_at_1", "value": 43.985}, {"type": "precision_at_10", "value": 10.202}, {"type": "precision_at_100", "value": 1.02}, {"type": "precision_at_1000", "value": 0.10200000000000001}, {"type": "precision_at_20", "value": 5.101}, {"type": "precision_at_3", "value": 23.933}, {"type": "precision_at_5", "value": 16.901}, {"type": "recall_at_1", "value": 35.36}, {"type": "recall_at_10", "value": 68.806}, {"type": "recall_at_100", "value": 68.806}, {"type": "recall_at_1000", "value": 68.806}, {"type": "recall_at_20", "value": 68.806}, {"type": "recall_at_3", "value": 52.714000000000006}, {"type": "recall_at_5", "value": 59.168}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackProgrammersRetrieval", "type": "mteb/cqadupstack-programmers", "config": "default", "split": "test", "revision": "6184bc1440d2dbc7612be22b50686b8826d22b32"}, "metrics": [{"type": "map_at_1", "value": 32.431}, {"type": "map_at_10", "value": 45.421}, {"type": "map_at_100", "value": 45.421}, {"type": "map_at_1000", "value": 45.421}, {"type": "map_at_20", "value": 45.421}, {"type": "map_at_3", "value": 41.82}, {"type": "map_at_5", "value": 43.692}, {"type": "mrr_at_1", "value": 41.096}, {"type": "mrr_at_10", "value": 51.293}, {"type": "mrr_at_100", "value": 51.293}, {"type": "mrr_at_1000", "value": 51.293}, {"type": "mrr_at_20", "value": 51.293}, {"type": "mrr_at_3", "value": 49.049}, {"type": "mrr_at_5", "value": 50.327}, {"type": "ndcg_at_1", "value": 41.096}, {"type": "ndcg_at_10", "value": 52.032999999999994}, {"type": "ndcg_at_100", "value": 51.903}, {"type": "ndcg_at_1000", "value": 51.897999999999996}, {"type": "ndcg_at_20", "value": 51.942}, {"type": "ndcg_at_3", "value": 47.024}, {"type": "ndcg_at_5", "value": 49.071}, {"type": "precision_at_1", "value": 41.096}, {"type": "precision_at_10", "value": 9.725999999999999}, {"type": "precision_at_100", "value": 0.9730000000000001}, {"type": "precision_at_1000", "value": 0.097}, {"type": "precision_at_20", "value": 4.8629999999999995}, {"type": "precision_at_3", "value": 23.097}, {"type": "precision_at_5", "value": 16.096}, {"type": "recall_at_1", "value": 32.431}, {"type": "recall_at_10", "value": 65.42999999999999}, 
{"type": "recall_at_100", "value": 65.42999999999999}, {"type": "recall_at_1000", "value": 65.42999999999999}, {"type": "recall_at_20", "value": 65.42999999999999}, {"type": "recall_at_3", "value": 50.856}, {"type": "recall_at_5", "value": 56.846}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackRetrieval", "type": "mteb/cqadupstack", "config": "default", "split": "test", "revision": "ad9991cb51e31e31e430383c75ffb2885547b5f0"}, "metrics": [{"type": "map_at_1", "value": 32.074749999999995}, {"type": "map_at_10", "value": 43.474}, {"type": "map_at_100", "value": 43.474}, {"type": "map_at_1000", "value": 43.474}, {"type": "map_at_20", "value": 43.474}, {"type": "map_at_3", "value": 40.10458333333333}, {"type": "map_at_5", "value": 42.010749999999994}, {"type": "mrr_at_1", "value": 38.60425}, {"type": "mrr_at_10", "value": 48.05550000000001}, {"type": "mrr_at_100", "value": 48.05550000000001}, {"type": "mrr_at_1000", "value": 48.05550000000001}, {"type": "mrr_at_20", "value": 48.05550000000001}, {"type": "mrr_at_3", "value": 45.58083333333334}, {"type": "mrr_at_5", "value": 47.04750000000001}, {"type": "ndcg_at_1", "value": 38.60425}, {"type": "ndcg_at_10", "value": 49.51958333333334}, {"type": "ndcg_at_100", "value": 49.3385}, {"type": "ndcg_at_1000", "value": 49.33491666666667}, {"type": "ndcg_at_20", "value": 49.393}, {"type": "ndcg_at_3", "value": 44.32699999999999}, {"type": "ndcg_at_5", "value": 46.81008333333333}, {"type": "precision_at_1", "value": 38.60425}, {"type": "precision_at_10", "value": 8.800666666666668}, {"type": "precision_at_100", "value": 0.8800833333333334}, {"type": "precision_at_1000", "value": 0.08808333333333335}, {"type": "precision_at_20", "value": 4.400333333333334}, {"type": "precision_at_3", "value": 20.723166666666664}, {"type": "precision_at_5", "value": 14.65683333333333}, {"type": "recall_at_1", "value": 32.074749999999995}, {"type": "recall_at_10", "value": 62.5025}, {"type": "recall_at_100", "value": 62.5025}, {"type": "recall_at_1000", "value": 62.5025}, {"type": "recall_at_20", "value": 62.5025}, {"type": "recall_at_3", "value": 47.81091666666667}, {"type": "recall_at_5", "value": 54.38974999999999}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackStatsRetrieval", "type": "mteb/cqadupstack-stats", "config": "default", "split": "test", "revision": "65ac3a16b8e91f9cee4c9828cc7c335575432a2a"}, "metrics": [{"type": "map_at_1", "value": 28.758}, {"type": "map_at_10", "value": 37.633}, {"type": "map_at_100", "value": 37.633}, {"type": "map_at_1000", "value": 37.633}, {"type": "map_at_20", "value": 37.633}, {"type": "map_at_3", "value": 34.865}, {"type": "map_at_5", "value": 36.437999999999995}, {"type": "mrr_at_1", "value": 32.208999999999996}, {"type": "mrr_at_10", "value": 40.598}, {"type": "mrr_at_100", "value": 40.598}, {"type": "mrr_at_1000", "value": 40.598}, {"type": "mrr_at_20", "value": 40.598}, {"type": "mrr_at_3", "value": 37.935}, {"type": "mrr_at_5", "value": 39.476}, {"type": "ndcg_at_1", "value": 32.208999999999996}, {"type": "ndcg_at_10", "value": 42.798}, {"type": "ndcg_at_100", "value": 42.768}, {"type": "ndcg_at_1000", "value": 42.768}, {"type": "ndcg_at_20", "value": 42.768}, {"type": "ndcg_at_3", "value": 37.651}, {"type": "ndcg_at_5", "value": 40.172999999999995}, {"type": "precision_at_1", "value": 32.208999999999996}, {"type": "precision_at_10", "value": 6.84}, {"type": "precision_at_100", "value": 0.6839999999999999}, {"type": "precision_at_1000", "value": 0.068}, {"type": 
"precision_at_20", "value": 3.42}, {"type": "precision_at_3", "value": 16.258}, {"type": "precision_at_5", "value": 11.472}, {"type": "recall_at_1", "value": 28.758}, {"type": "recall_at_10", "value": 55.55799999999999}, {"type": "recall_at_100", "value": 55.55799999999999}, {"type": "recall_at_1000", "value": 55.55799999999999}, {"type": "recall_at_20", "value": 55.55799999999999}, {"type": "recall_at_3", "value": 41.488}, {"type": "recall_at_5", "value": 47.659}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackTexRetrieval", "type": "mteb/cqadupstack-tex", "config": "default", "split": "test", "revision": "46989137a86843e03a6195de44b09deda022eec7"}, "metrics": [{"type": "map_at_1", "value": 21.088}, {"type": "map_at_10", "value": 30.297}, {"type": "map_at_100", "value": 30.297}, {"type": "map_at_1000", "value": 30.297}, {"type": "map_at_20", "value": 30.297}, {"type": "map_at_3", "value": 27.376}, {"type": "map_at_5", "value": 29.064}, {"type": "mrr_at_1", "value": 26.358999999999998}, {"type": "mrr_at_10", "value": 34.996}, {"type": "mrr_at_100", "value": 34.996}, {"type": "mrr_at_1000", "value": 34.996}, {"type": "mrr_at_20", "value": 34.996}, {"type": "mrr_at_3", "value": 32.467}, {"type": "mrr_at_5", "value": 33.944}, {"type": "ndcg_at_1", "value": 26.358999999999998}, {"type": "ndcg_at_10", "value": 35.851}, {"type": "ndcg_at_100", "value": 35.731}, {"type": "ndcg_at_1000", "value": 35.729}, {"type": "ndcg_at_20", "value": 35.77}, {"type": "ndcg_at_3", "value": 30.97}, {"type": "ndcg_at_5", "value": 33.312000000000005}, {"type": "precision_at_1", "value": 26.358999999999998}, {"type": "precision_at_10", "value": 6.641}, {"type": "precision_at_100", "value": 0.664}, {"type": "precision_at_1000", "value": 0.066}, {"type": "precision_at_20", "value": 3.321}, {"type": "precision_at_3", "value": 14.923}, {"type": "precision_at_5", "value": 10.86}, {"type": "recall_at_1", "value": 21.088}, {"type": "recall_at_10", "value": 47.818}, {"type": "recall_at_100", "value": 47.818}, {"type": "recall_at_1000", "value": 47.818}, {"type": "recall_at_20", "value": 47.818}, {"type": "recall_at_3", "value": 33.815}, {"type": "recall_at_5", "value": 39.973}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackUnixRetrieval", "type": "mteb/cqadupstack-unix", "config": "default", "split": "test", "revision": "6c6430d3a6d36f8d2a829195bc5dc94d7e063e53"}, "metrics": [{"type": "map_at_1", "value": 33.579}, {"type": "map_at_10", "value": 44.875}, {"type": "map_at_100", "value": 44.875}, {"type": "map_at_1000", "value": 44.875}, {"type": "map_at_20", "value": 44.875}, {"type": "map_at_3", "value": 41.64}, {"type": "map_at_5", "value": 43.433}, {"type": "mrr_at_1", "value": 40.111999999999995}, {"type": "mrr_at_10", "value": 49.586999999999996}, {"type": "mrr_at_100", "value": 49.586999999999996}, {"type": "mrr_at_1000", "value": 49.586999999999996}, {"type": "mrr_at_20", "value": 49.586999999999996}, {"type": "mrr_at_3", "value": 47.233000000000004}, {"type": "mrr_at_5", "value": 48.613}, {"type": "ndcg_at_1", "value": 40.111999999999995}, {"type": "ndcg_at_10", "value": 50.836000000000006}, {"type": "ndcg_at_100", "value": 50.822}, {"type": "ndcg_at_1000", "value": 50.822}, {"type": "ndcg_at_20", "value": 50.822}, {"type": "ndcg_at_3", "value": 45.737}, {"type": "ndcg_at_5", "value": 48.081}, {"type": "precision_at_1", "value": 40.111999999999995}, {"type": "precision_at_10", "value": 8.674999999999999}, {"type": "precision_at_100", "value": 0.868}, {"type": 
"precision_at_1000", "value": 0.087}, {"type": "precision_at_20", "value": 4.338}, {"type": "precision_at_3", "value": 21.02}, {"type": "precision_at_5", "value": 14.682999999999998}, {"type": "recall_at_1", "value": 33.579}, {"type": "recall_at_10", "value": 64.02600000000001}, {"type": "recall_at_100", "value": 64.02600000000001}, {"type": "recall_at_1000", "value": 64.02600000000001}, {"type": "recall_at_20", "value": 64.02600000000001}, {"type": "recall_at_3", "value": 49.788}, {"type": "recall_at_5", "value": 55.931}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackWebmastersRetrieval", "type": "mteb/cqadupstack-webmasters", "config": "default", "split": "test", "revision": "160c094312a0e1facb97e55eeddb698c0abe3571"}, "metrics": [{"type": "map_at_1", "value": 31.497999999999998}, {"type": "map_at_10", "value": 43.456}, {"type": "map_at_100", "value": 43.456}, {"type": "map_at_1000", "value": 43.456}, {"type": "map_at_20", "value": 43.456}, {"type": "map_at_3", "value": 40.125}, {"type": "map_at_5", "value": 41.829}, {"type": "mrr_at_1", "value": 38.735}, {"type": "mrr_at_10", "value": 48.756}, {"type": "mrr_at_100", "value": 48.756}, {"type": "mrr_at_1000", "value": 48.756}, {"type": "mrr_at_20", "value": 48.756}, {"type": "mrr_at_3", "value": 46.113}, {"type": "mrr_at_5", "value": 47.684}, {"type": "ndcg_at_1", "value": 38.735}, {"type": "ndcg_at_10", "value": 50.241}, {"type": "ndcg_at_100", "value": 49.458}, {"type": "ndcg_at_1000", "value": 49.437999999999995}, {"type": "ndcg_at_20", "value": 49.756}, {"type": "ndcg_at_3", "value": 45.14}, {"type": "ndcg_at_5", "value": 47.406}, {"type": "precision_at_1", "value": 38.735}, {"type": "precision_at_10", "value": 9.763}, {"type": "precision_at_100", "value": 0.976}, {"type": "precision_at_1000", "value": 0.098}, {"type": "precision_at_20", "value": 4.881}, {"type": "precision_at_3", "value": 21.673000000000002}, {"type": "precision_at_5", "value": 15.455}, {"type": "recall_at_1", "value": 31.497999999999998}, {"type": "recall_at_10", "value": 62.568999999999996}, {"type": "recall_at_100", "value": 62.568999999999996}, {"type": "recall_at_1000", "value": 62.568999999999996}, {"type": "recall_at_20", "value": 62.568999999999996}, {"type": "recall_at_3", "value": 47.842}, {"type": "recall_at_5", "value": 54.159}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackWordpressRetrieval", "type": "mteb/cqadupstack-wordpress", "config": "default", "split": "test", "revision": "4ffe81d471b1924886b33c7567bfb200e9eec5c4"}, "metrics": [{"type": "map_at_1", "value": 24.991}, {"type": "map_at_10", "value": 34.183}, {"type": "map_at_100", "value": 34.183}, {"type": "map_at_1000", "value": 34.183}, {"type": "map_at_20", "value": 34.183}, {"type": "map_at_3", "value": 31.592}, {"type": "map_at_5", "value": 33.121}, {"type": "mrr_at_1", "value": 27.172}, {"type": "mrr_at_10", "value": 36.463}, {"type": "mrr_at_100", "value": 36.463}, {"type": "mrr_at_1000", "value": 36.463}, {"type": "mrr_at_20", "value": 36.463}, {"type": "mrr_at_3", "value": 34.165}, {"type": "mrr_at_5", "value": 35.616}, {"type": "ndcg_at_1", "value": 27.172}, {"type": "ndcg_at_10", "value": 39.311}, {"type": "ndcg_at_100", "value": 39.292}, {"type": "ndcg_at_1000", "value": 39.292}, {"type": "ndcg_at_20", "value": 39.301}, {"type": "ndcg_at_3", "value": 34.498}, {"type": "ndcg_at_5", "value": 37.006}, {"type": "precision_at_1", "value": 27.172}, {"type": "precision_at_10", "value": 6.174}, {"type": "precision_at_100", "value": 0.617}, 
{"type": "precision_at_1000", "value": 0.062}, {"type": "precision_at_20", "value": 3.087}, {"type": "precision_at_3", "value": 14.972}, {"type": "precision_at_5", "value": 10.499}, {"type": "recall_at_1", "value": 24.991}, {"type": "recall_at_10", "value": 52.649}, {"type": "recall_at_100", "value": 52.649}, {"type": "recall_at_1000", "value": 52.649}, {"type": "recall_at_20", "value": 52.649}, {"type": "recall_at_3", "value": 39.818}, {"type": "recall_at_5", "value": 45.927}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB ClimateFEVER", "type": "mteb/climate-fever", "config": "default", "split": "test", "revision": "47f2ac6acb640fc46020b02a5b59fdda04d39380"}, "metrics": [{"type": "map_at_1", "value": 12.475999999999999}, {"type": "map_at_10", "value": 22.999}, {"type": "map_at_100", "value": 22.999}, {"type": "map_at_1000", "value": 22.999}, {"type": "map_at_20", "value": 22.999}, {"type": "map_at_3", "value": 18.804000000000002}, {"type": "map_at_5", "value": 20.987000000000002}, {"type": "mrr_at_1", "value": 28.404}, {"type": "mrr_at_10", "value": 42.335}, {"type": "mrr_at_100", "value": 42.335}, {"type": "mrr_at_1000", "value": 42.335}, {"type": "mrr_at_20", "value": 42.335}, {"type": "mrr_at_3", "value": 39.11}, {"type": "mrr_at_5", "value": 40.953}, {"type": "ndcg_at_1", "value": 28.404}, {"type": "ndcg_at_10", "value": 32.467}, {"type": "ndcg_at_100", "value": 32.467}, {"type": "ndcg_at_1000", "value": 32.467}, {"type": "ndcg_at_20", "value": 32.467}, {"type": "ndcg_at_3", "value": 26.334999999999997}, {"type": "ndcg_at_5", "value": 28.493000000000002}, {"type": "precision_at_1", "value": 28.404}, {"type": "precision_at_10", "value": 10.43}, {"type": "precision_at_100", "value": 1.043}, {"type": "precision_at_1000", "value": 0.104}, {"type": "precision_at_20", "value": 5.215}, {"type": "precision_at_3", "value": 20.13}, {"type": "precision_at_5", "value": 15.595999999999998}, {"type": "recall_at_1", "value": 12.475999999999999}, {"type": "recall_at_10", "value": 39.757}, {"type": "recall_at_100", "value": 39.757}, {"type": "recall_at_1000", "value": 39.757}, {"type": "recall_at_20", "value": 39.757}, {"type": "recall_at_3", "value": 24.695}, {"type": "recall_at_5", "value": 30.864000000000004}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB DBPedia", "type": "mteb/dbpedia", "config": "default", "split": "test", "revision": "c0f706b76e590d620bd6618b3ca8efdd34e2d659"}, "metrics": [{"type": "map_at_1", "value": 9.261999999999999}, {"type": "map_at_10", "value": 23.807000000000002}, {"type": "map_at_100", "value": 23.807000000000002}, {"type": "map_at_1000", "value": 23.807000000000002}, {"type": "map_at_20", "value": 23.807000000000002}, {"type": "map_at_3", "value": 15.776000000000002}, {"type": "map_at_5", "value": 19.17}, {"type": "mrr_at_1", "value": 71.75}, {"type": "mrr_at_10", "value": 79.959}, {"type": "mrr_at_100", "value": 79.959}, {"type": "mrr_at_1000", "value": 79.959}, {"type": "mrr_at_20", "value": 79.959}, {"type": "mrr_at_3", "value": 78.625}, {"type": "mrr_at_5", "value": 79.412}, {"type": "ndcg_at_1", "value": 59.5}, {"type": "ndcg_at_10", "value": 48.988}, {"type": "ndcg_at_100", "value": 37.452000000000005}, {"type": "ndcg_at_1000", "value": 37.32}, {"type": "ndcg_at_20", "value": 41.387}, {"type": "ndcg_at_3", "value": 52.567}, {"type": "ndcg_at_5", "value": 50.649}, {"type": "precision_at_1", "value": 71.75}, {"type": "precision_at_10", "value": 40.425}, {"type": "precision_at_100", "value": 4.042}, {"type": "precision_at_1000", 
"value": 0.404}, {"type": "precision_at_20", "value": 20.212}, {"type": "precision_at_3", "value": 57.75}, {"type": "precision_at_5", "value": 50.349999999999994}, {"type": "recall_at_1", "value": 9.261999999999999}, {"type": "recall_at_10", "value": 30.329}, {"type": "recall_at_100", "value": 30.329}, {"type": "recall_at_1000", "value": 30.329}, {"type": "recall_at_20", "value": 30.329}, {"type": "recall_at_3", "value": 17.422}, {"type": "recall_at_5", "value": 22.598}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB EmotionClassification", "type": "mteb/emotion", "config": "default", "split": "test", "revision": "4f58c6b202a23cf9a4da393831edf4f9183cad37"}, "metrics": [{"type": "accuracy", "value": 52.014999999999986}, {"type": "f1", "value": 47.33036786740981}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB FEVER", "type": "mteb/fever", "config": "default", "split": "test", "revision": "bea83ef9e8fb933d90a2f1d5515737465d613e12"}, "metrics": [{"type": "map_at_1", "value": 82.00800000000001}, {"type": "map_at_10", "value": 88.02799999999999}, {"type": "map_at_100", "value": 88.02799999999999}, {"type": "map_at_1000", "value": 88.02799999999999}, {"type": "map_at_20", "value": 88.02799999999999}, {"type": "map_at_3", "value": 87.249}, {"type": "map_at_5", "value": 87.78399999999999}, {"type": "mrr_at_1", "value": 88.299}, {"type": "mrr_at_10", "value": 92.92}, {"type": "mrr_at_100", "value": 92.92}, {"type": "mrr_at_1000", "value": 92.92}, {"type": "mrr_at_20", "value": 92.92}, {"type": "mrr_at_3", "value": 92.56400000000001}, {"type": "mrr_at_5", "value": 92.83200000000001}, {"type": "ndcg_at_1", "value": 88.299}, {"type": "ndcg_at_10", "value": 90.88000000000001}, {"type": "ndcg_at_100", "value": 90.879}, {"type": "ndcg_at_1000", "value": 90.879}, {"type": "ndcg_at_20", "value": 90.879}, {"type": "ndcg_at_3", "value": 89.85499999999999}, {"type": "ndcg_at_5", "value": 90.485}, {"type": "precision_at_1", "value": 88.299}, {"type": "precision_at_10", "value": 10.522}, {"type": "precision_at_100", "value": 1.052}, {"type": "precision_at_1000", "value": 0.105}, {"type": "precision_at_20", "value": 5.261}, {"type": "precision_at_3", "value": 33.573}, {"type": "precision_at_5", "value": 20.633000000000003}, {"type": "recall_at_1", "value": 82.00800000000001}, {"type": "recall_at_10", "value": 94.952}, {"type": "recall_at_100", "value": 94.952}, {"type": "recall_at_1000", "value": 94.952}, {"type": "recall_at_20", "value": 94.952}, {"type": "recall_at_3", "value": 92.089}, {"type": "recall_at_5", "value": 93.794}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB FiQA2018", "type": "mteb/fiqa", "config": "default", "split": "test", "revision": "27a168819829fe9bcd655c2df245fb19452e8e06"}, "metrics": [{"type": "map_at_1", "value": 26.857}, {"type": "map_at_10", "value": 44.645}, {"type": "map_at_100", "value": 44.645}, {"type": "map_at_1000", "value": 44.645}, {"type": "map_at_20", "value": 44.645}, {"type": "map_at_3", "value": 38.166}, {"type": "map_at_5", "value": 41.992000000000004}, {"type": "mrr_at_1", "value": 50.309000000000005}, {"type": "mrr_at_10", "value": 59.59100000000001}, {"type": "mrr_at_100", "value": 59.59100000000001}, {"type": "mrr_at_1000", "value": 59.59100000000001}, {"type": "mrr_at_20", "value": 59.59100000000001}, {"type": "mrr_at_3", "value": 56.97}, {"type": "mrr_at_5", "value": 58.498000000000005}, {"type": "ndcg_at_1", "value": 50.309000000000005}, {"type": "ndcg_at_10", "value": 53.221}, {"type": "ndcg_at_100", "value": 
53.15800000000001}, {"type": "ndcg_at_1000", "value": 53.15800000000001}, {"type": "ndcg_at_20", "value": 53.15800000000001}, {"type": "ndcg_at_3", "value": 47.506}, {"type": "ndcg_at_5", "value": 49.922}, {"type": "precision_at_1", "value": 50.309000000000005}, {"type": "precision_at_10", "value": 14.985000000000001}, {"type": "precision_at_100", "value": 1.498}, {"type": "precision_at_1000", "value": 0.15}, {"type": "precision_at_20", "value": 7.492}, {"type": "precision_at_3", "value": 31.635999999999996}, {"type": "precision_at_5", "value": 24.043}, {"type": "recall_at_1", "value": 26.857}, {"type": "recall_at_10", "value": 62.051}, {"type": "recall_at_100", "value": 62.051}, {"type": "recall_at_1000", "value": 62.051}, {"type": "recall_at_20", "value": 62.051}, {"type": "recall_at_3", "value": 42.966}, {"type": "recall_at_5", "value": 51.943}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB HotpotQA", "type": "mteb/hotpotqa", "config": "default", "split": "test", "revision": "ab518f4d6fcca38d87c25209f94beba119d02014"}, "metrics": [{"type": "map_at_1", "value": 40.891}, {"type": "map_at_10", "value": 70.431}, {"type": "map_at_100", "value": 70.431}, {"type": "map_at_1000", "value": 70.431}, {"type": "map_at_20", "value": 70.431}, {"type": "map_at_3", "value": 66.704}, {"type": "map_at_5", "value": 69.179}, {"type": "mrr_at_1", "value": 81.783}, {"type": "mrr_at_10", "value": 87.368}, {"type": "mrr_at_100", "value": 87.368}, {"type": "mrr_at_1000", "value": 87.368}, {"type": "mrr_at_20", "value": 87.368}, {"type": "mrr_at_3", "value": 86.59700000000001}, {"type": "mrr_at_5", "value": 87.128}, {"type": "ndcg_at_1", "value": 81.783}, {"type": "ndcg_at_10", "value": 77.697}, {"type": "ndcg_at_100", "value": 77.697}, {"type": "ndcg_at_1000", "value": 77.697}, {"type": "ndcg_at_20", "value": 77.697}, {"type": "ndcg_at_3", "value": 72.688}, {"type": "ndcg_at_5", "value": 75.69200000000001}, {"type": "precision_at_1", "value": 81.783}, {"type": "precision_at_10", "value": 16.488}, {"type": "precision_at_100", "value": 1.649}, {"type": "precision_at_1000", "value": 0.165}, {"type": "precision_at_20", "value": 8.244}, {"type": "precision_at_3", "value": 47.693000000000005}, {"type": "precision_at_5", "value": 30.976}, {"type": "recall_at_1", "value": 40.891}, {"type": "recall_at_10", "value": 82.438}, {"type": "recall_at_100", "value": 82.438}, {"type": "recall_at_1000", "value": 82.438}, {"type": "recall_at_20", "value": 82.438}, {"type": "recall_at_3", "value": 71.54}, {"type": "recall_at_5", "value": 77.441}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB ImdbClassification", "type": "mteb/imdb", "config": "default", "split": "test", "revision": "3d86128a09e091d6018b6d26cad27f2739fc2db7"}, "metrics": [{"type": "accuracy", "value": 89.47240000000001}, {"type": "ap", "value": 85.75618304701787}, {"type": "f1", "value": 89.44156774176075}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB MSMARCO", "type": "mteb/msmarco", "config": "default", "split": "dev", "revision": "c5a29a104738b98a9e76336939199e264163d4a0"}, "metrics": [{"type": "map_at_1", "value": 19.941}, {"type": "map_at_10", "value": 33.108}, {"type": "map_at_100", "value": 33.108}, {"type": "map_at_1000", "value": 33.108}, {"type": "map_at_20", "value": 33.108}, {"type": "map_at_3", "value": 28.716}, {"type": "map_at_5", "value": 31.255}, {"type": "mrr_at_1", "value": 20.458000000000002}, {"type": "mrr_at_10", "value": 33.646}, {"type": "mrr_at_100", "value": 33.646}, {"type": "mrr_at_1000", 
"value": 33.646}, {"type": "mrr_at_20", "value": 33.646}, {"type": "mrr_at_3", "value": 29.360000000000003}, {"type": "mrr_at_5", "value": 31.849}, {"type": "ndcg_at_1", "value": 20.458000000000002}, {"type": "ndcg_at_10", "value": 40.664}, {"type": "ndcg_at_100", "value": 40.664}, {"type": "ndcg_at_1000", "value": 40.664}, {"type": "ndcg_at_20", "value": 40.664}, {"type": "ndcg_at_3", "value": 31.733}, {"type": "ndcg_at_5", "value": 36.266999999999996}, {"type": "precision_at_1", "value": 20.458000000000002}, {"type": "precision_at_10", "value": 6.703}, {"type": "precision_at_100", "value": 0.67}, {"type": "precision_at_1000", "value": 0.067}, {"type": "precision_at_20", "value": 3.3520000000000003}, {"type": "precision_at_3", "value": 13.777000000000001}, {"type": "precision_at_5", "value": 10.564}, {"type": "recall_at_1", "value": 19.941}, {"type": "recall_at_10", "value": 64.103}, {"type": "recall_at_100", "value": 64.103}, {"type": "recall_at_1000", "value": 64.103}, {"type": "recall_at_20", "value": 64.103}, {"type": "recall_at_3", "value": 39.800999999999995}, {"type": "recall_at_5", "value": 50.727999999999994}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MTOPDomainClassification (en)", "type": "mteb/mtop_domain", "config": "en", "split": "test", "revision": "d80d48c1eb48d3562165c59d59d0034df9fff0bf"}, "metrics": [{"type": "accuracy", "value": 96.45690834473322}, {"type": "f1", "value": 96.19980363353172}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MTOPIntentClassification (en)", "type": "mteb/mtop_intent", "config": "en", "split": "test", "revision": "ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba"}, "metrics": [{"type": "accuracy", "value": 85.38075695394436}, {"type": "f1", "value": 71.33409850817071}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MassiveIntentClassification (en)", "type": "mteb/amazon_massive_intent", "config": "en", "split": "test", "revision": "31efe3c427b0bae9c22cbb560b8f15491cc6bed7"}, "metrics": [{"type": "accuracy", "value": 80.12104909213183}, {"type": "f1", "value": 77.26691038674358}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MassiveScenarioClassification (en)", "type": "mteb/amazon_massive_scenario", "config": "en", "split": "test", "revision": "7d571f92784cd94a019292a1f45445077d0ef634"}, "metrics": [{"type": "accuracy", "value": 82.69670477471418}, {"type": "f1", "value": 82.31935226516424}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB MedrxivClusteringP2P", "type": "mteb/medrxiv-clustering-p2p", "config": "default", "split": "test", "revision": "e7a26af6f3ae46b30dde8737f02c07b1505bcc73"}, "metrics": [{"type": "v_measure", "value": 32.733209733023}, {"type": "v_measures", "value": [0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 
0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 
0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 
0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 
0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 
0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 
0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792, 0.3268102022520237, 0.30894802212942296, 0.3267412500148118, 0.3083054819872514, 0.31284256226804597, 0.33514297956992917, 0.3297363893986241, 0.34536511251773544, 0.3498041803334763, 0.3296247928309792]}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB MedrxivClusteringS2S", "type": "mteb/medrxiv-clustering-s2s", "config": "default", "split": "test", "revision": "35191c8c0dca72d8ff3efcd72aa802307d469663"}, "metrics": [{"type": "v_measure", "value": 32.325298069936835}, {"type": "v_measures", "value": [0.3150861010517341, 0.32155893979978684, 0.3177517595474066, 0.30022420485295037, 0.3197693379138355, 0.33188657891678974, 0.3386256684441414, 0.3305481006752689, 0.3436443696432427, 0.31343474614852757, 0.3150861010517341, 0.32155893979978684, 0.3177517595474066, 0.30022420485295037, 0.3197693379138355, 0.33188657891678974, 0.3386256684441414, 0.3305481006752689, 0.3436443696432427, 0.31343474614852757, 0.3150861010517341, 0.32155893979978684, 0.3177517595474066, 0.30022420485295037, 0.3197693379138355, 0.33188657891678974, 0.3386256684441414, 0.3305481006752689, 0.3436443696432427, 0.31343474614852757, 0.3150861010517341, 0.32155893979978684, 0.3177517595474066, 0.30022420485295037, 0.3197693379138355, 0.33188657891678974, 0.3386256684441414, 0.3305481006752689, 0.3436443696432427, 0.31343474614852757, 0.3150861010517341, 0.32155893979978684, 0.3177517595474066, 0.30022420485295037, 0.3197693379138355, 0.33188657891678974, 0.3386256684441414, 0.3305481006752689, 0.3436443696432427, 0.31343474614852757, 0.3150861010517341, 0.32155893979978684, 0.3177517595474066, 0.30022420485295037, 0.3197693379138355, 0.33188657891678974, 0.3386256684441414, 0.3305481006752689, 0.3436443696432427, 0.31343474614852757, 0.3150861010517341, 0.32155893979978684, 0.3177517595474066, 0.30022420485295037, 0.3197693379138355, 0.33188657891678974, 
{"task": {"type": "Reranking"}, "dataset": {"name": "MTEB MindSmallReranking", "type": "mteb/mind_small", "config": "default", "split": "test", "revision": "3bdac13927fdc888b903db93b2ffdbd90b295a69"}, "metrics": [{"type": "map", "value": 32.511595472837335}, {"type": "mrr", "value": 33.73044905745997}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB NFCorpus", "type": "mteb/nfcorpus", "config": "default", "split": "test", "revision": "ec0fa4fe99da2ff19ca1214b7966684033a58814"}, "metrics": [{"type": "map_at_1", "value": 7.0760000000000005}, {"type": "map_at_10", "value": 16.039}, {"type": "map_at_100", "value": 16.039}, {"type": "map_at_1000", "value": 16.039}, {"type": "map_at_20", "value": 16.039}, {"type": "map_at_3", "value": 11.408}, {"type": "map_at_5", "value": 13.547}, {"type": "mrr_at_1", "value": 53.559999999999995}, {"type": "mrr_at_10", "value": 61.531000000000006}, {"type": "mrr_at_100", "value": 61.531000000000006}, {"type": "mrr_at_1000", "value": 61.531000000000006}, {"type": "mrr_at_20", "value": 61.531000000000006}, {"type": "mrr_at_3", "value": 59.236}, {"type": "mrr_at_5", "value": 60.49}, {"type": "ndcg_at_1", "value": 51.083999999999996}, {"type": "ndcg_at_10", "value": 41.332}, {"type": "ndcg_at_100", "value": 27.083000000000002}, {"type": "ndcg_at_1000", "value": 26.619}, {"type": "ndcg_at_20", "value": 33.188}, {"type": "ndcg_at_3", "value": 46.605999999999995}, {"type": "ndcg_at_5", "value": 44.362}, {"type": "precision_at_1", "value": 52.941}, {"type": "precision_at_10", "value": 30.65}, {"type": "precision_at_100", "value": 3.065}, {"type": "precision_at_1000", "value": 0.307}, {"type": "precision_at_20", "value": 15.325}, {"type": "precision_at_3", "value": 43.447}, {"type": "precision_at_5", "value": 38.266}, {"type": "recall_at_1", "value": 7.0760000000000005}, {"type": "recall_at_10", "value": 20.929000000000002}, {"type": "recall_at_100", "value": 20.929000000000002}, {"type": "recall_at_1000", "value": 20.929000000000002}, {"type": "recall_at_20", "value": 20.929000000000002}, {"type": "recall_at_3", "value": 12.601}, {"type": "recall_at_5", "value": 15.955}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB NQ", "type": "mteb/nq", "config": "default",
"split": "test", "revision": "b774495ed302d8c44a3a7ea25c90dbce03968f31"}, "metrics": [{"type": "map_at_1", "value": 39.204}, {"type": "map_at_10", "value": 56.808}, {"type": "map_at_100", "value": 56.808}, {"type": "map_at_1000", "value": 56.808}, {"type": "map_at_20", "value": 56.808}, {"type": "map_at_3", "value": 52.471999999999994}, {"type": "map_at_5", "value": 55.191}, {"type": "mrr_at_1", "value": 44.032}, {"type": "mrr_at_10", "value": 59.158}, {"type": "mrr_at_100", "value": 59.158}, {"type": "mrr_at_1000", "value": 59.158}, {"type": "mrr_at_20", "value": 59.158}, {"type": "mrr_at_3", "value": 55.948}, {"type": "mrr_at_5", "value": 57.96}, {"type": "ndcg_at_1", "value": 44.032}, {"type": "ndcg_at_10", "value": 64.672}, {"type": "ndcg_at_100", "value": 64.672}, {"type": "ndcg_at_1000", "value": 64.672}, {"type": "ndcg_at_20", "value": 64.672}, {"type": "ndcg_at_3", "value": 56.955999999999996}, {"type": "ndcg_at_5", "value": 61.278999999999996}, {"type": "precision_at_1", "value": 44.032}, {"type": "precision_at_10", "value": 10.295}, {"type": "precision_at_100", "value": 1.03}, {"type": "precision_at_1000", "value": 0.10300000000000001}, {"type": "precision_at_20", "value": 5.148}, {"type": "precision_at_3", "value": 25.83}, {"type": "precision_at_5", "value": 18.053}, {"type": "recall_at_1", "value": 39.204}, {"type": "recall_at_10", "value": 85.936}, {"type": "recall_at_100", "value": 85.936}, {"type": "recall_at_1000", "value": 85.936}, {"type": "recall_at_20", "value": 85.936}, {"type": "recall_at_3", "value": 66.387}, {"type": "recall_at_5", "value": 76.238}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB QuoraRetrieval", "type": "mteb/quora", "config": "default", "split": "test", "revision": "e4e08e0b7dbe3c8700f0daef558ff32256715259"}, "metrics": [{"type": "map_at_1", "value": 71.068}, {"type": "map_at_10", "value": 85.271}, {"type": "map_at_100", "value": 85.271}, {"type": "map_at_1000", "value": 85.271}, {"type": "map_at_20", "value": 85.271}, {"type": "map_at_3", "value": 82.23899999999999}, {"type": "map_at_5", "value": 84.165}, {"type": "mrr_at_1", "value": 81.85}, {"type": "mrr_at_10", "value": 87.856}, {"type": "mrr_at_100", "value": 87.856}, {"type": "mrr_at_1000", "value": 87.856}, {"type": "mrr_at_20", "value": 87.856}, {"type": "mrr_at_3", "value": 86.925}, {"type": "mrr_at_5", "value": 87.559}, {"type": "ndcg_at_1", "value": 81.89}, {"type": "ndcg_at_10", "value": 88.856}, {"type": "ndcg_at_100", "value": 88.723}, {"type": "ndcg_at_1000", "value": 88.723}, {"type": "ndcg_at_20", "value": 88.74300000000001}, {"type": "ndcg_at_3", "value": 86.05199999999999}, {"type": "ndcg_at_5", "value": 87.61}, {"type": "precision_at_1", "value": 81.89}, {"type": "precision_at_10", "value": 13.569999999999999}, {"type": "precision_at_100", "value": 1.357}, {"type": "precision_at_1000", "value": 0.136}, {"type": "precision_at_20", "value": 6.784999999999999}, {"type": "precision_at_3", "value": 37.807}, {"type": "precision_at_5", "value": 24.908}, {"type": "recall_at_1", "value": 71.068}, {"type": "recall_at_10", "value": 95.797}, {"type": "recall_at_100", "value": 95.797}, {"type": "recall_at_1000", "value": 95.797}, {"type": "recall_at_20", "value": 95.797}, {"type": "recall_at_3", "value": 87.65899999999999}, {"type": "recall_at_5", "value": 92.107}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB RedditClustering", "type": "mteb/reddit-clustering", "config": "default", "split": "test", "revision": "24640382cdbf8abc73003fb0fa6d111a705499eb"}, 
"metrics": [{"type": "v_measure", "value": 62.16385792305745}, {"type": "v_measures", "value": [0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 
0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 
0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 
0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 
0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 
0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 
0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 
0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 
0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 
0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 
0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 
0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 
0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 
0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949, 0.587541495063312, 0.6775885692888917, 0.567537304495857, 0.6678131151900565, 0.5979158867861365, 0.5956669142507505, 0.6200557398628851, 0.5946598821061599, 0.603673972233609, 0.6050659113737895, 0.5742475015975338, 0.6273500769309232, 0.6526752602522112, 0.6416095306029318, 0.7334431385812594, 0.5847584715164077, 0.62727067061333, 0.6138592437270369, 0.6280966889397946, 0.5740785434257363, 0.5888905363406383, 0.6073133172766488, 0.7167239234204423, 0.6595740709329266, 0.5935547159550949]}]}, {"task": {"type": 
"Clustering"}, "dataset": {"name": "MTEB RedditClusteringP2P", "type": "mteb/reddit-clustering-p2p", "config": "default", "split": "test", "revision": "385e3cb46b4cfa89021f56c4380204149d0efe33"}, "metrics": [{"type": "v_measure", "value": 65.96296778394698}, {"type": "v_measures", "value": [0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 
0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 
0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 
0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 
0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 
0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593, 0.6994519018160104, 0.6685310786302858, 0.6560344637869603, 0.4053977970605211, 0.7423583342619767, 0.6657872853200192, 0.4487425514322897, 0.7791528368405061, 0.7406529421724692, 0.7901875870736593]}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB SCIDOCS", "type": "mteb/scidocs", "config": "default", "split": "test", "revision": "f8c2fcf00f625baaa80f62ec5bd9e1fff3b8ae88"}, "metrics": [{"type": "map_at_1", "value": 5.433000000000001}, {"type": "map_at_10", "value": 13.991000000000001}, {"type": "map_at_100", "value": 13.991000000000001}, {"type": "map_at_1000", "value": 13.991000000000001}, {"type": "map_at_20", "value": 13.991000000000001}, {"type": "map_at_3", "value": 9.708}, {"type": "map_at_5", "value": 11.849}, {"type": "mrr_at_1", "value": 26.8}, {"type": "mrr_at_10", "value": 38.012}, {"type": "mrr_at_100", "value": 38.012}, {"type": "mrr_at_1000", "value": 38.012}, {"type": "mrr_at_20", "value": 38.012}, {"type": "mrr_at_3", "value": 34.449999999999996}, {"type": "mrr_at_5", "value": 36.59}, {"type": "ndcg_at_1", "value": 26.8}, {"type": "ndcg_at_10", "value": 23.006999999999998}, {"type": "ndcg_at_100", "value": 23.006999999999998}, 
{"type": "ndcg_at_1000", "value": 23.006999999999998}, {"type": "ndcg_at_20", "value": 23.006999999999998}, {"type": "ndcg_at_3", "value": 21.386}, {"type": "ndcg_at_5", "value": 19.046}, {"type": "precision_at_1", "value": 26.8}, {"type": "precision_at_10", "value": 12.01}, {"type": "precision_at_100", "value": 1.201}, {"type": "precision_at_1000", "value": 0.12}, {"type": "precision_at_20", "value": 6.005}, {"type": "precision_at_3", "value": 19.833000000000002}, {"type": "precision_at_5", "value": 16.84}, {"type": "recall_at_1", "value": 5.433000000000001}, {"type": "recall_at_10", "value": 24.34}, {"type": "recall_at_100", "value": 24.34}, {"type": "recall_at_1000", "value": 24.34}, {"type": "recall_at_20", "value": 24.34}, {"type": "recall_at_3", "value": 12.058}, {"type": "recall_at_5", "value": 17.058}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB SICK-R", "type": "mteb/sickr-sts", "config": "default", "split": "test", "revision": "20a6d6f312dd54037fe07a32d58e5e168867909d"}, "metrics": [{"type": "cos_sim_pearson", "value": 84.84178272773948}, {"type": "cos_sim_spearman", "value": 82.32746830315172}, {"type": "euclidean_pearson", "value": 82.11599650658388}, {"type": "euclidean_spearman", "value": 82.38102437050075}, {"type": "manhattan_pearson", "value": 82.07071847892156}, {"type": "manhattan_spearman", "value": 82.35710877093594}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS12", "type": "mteb/sts12-sts", "config": "default", "split": "test", "revision": "a0d554a64d88156834ff5ae9920b964011b16384"}, "metrics": [{"type": "cos_sim_pearson", "value": 86.86916828280668}, {"type": "cos_sim_spearman", "value": 79.69553254808825}, {"type": "euclidean_pearson", "value": 82.86582224049857}, {"type": "euclidean_spearman", "value": 79.1765897124049}, {"type": "manhattan_pearson", "value": 83.15978473993391}, {"type": "manhattan_spearman", "value": 79.54192003597332}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS13", "type": "mteb/sts13-sts", "config": "default", "split": "test", "revision": "7e90230a92c190f1bf69ae9002b8cea547a64cca"}, "metrics": [{"type": "cos_sim_pearson", "value": 88.7719804239987}, {"type": "cos_sim_spearman", "value": 89.20788765830103}, {"type": "euclidean_pearson", "value": 88.67624029627581}, {"type": "euclidean_spearman", "value": 89.15058058277351}, {"type": "manhattan_pearson", "value": 88.43477620818435}, {"type": "manhattan_spearman", "value": 89.01994285052193}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS14", "type": "mteb/sts14-sts", "config": "default", "split": "test", "revision": "6031580fec1f6af667f0bd2da0a551cf4f0b2375"}, "metrics": [{"type": "cos_sim_pearson", "value": 87.04733612348426}, {"type": "cos_sim_spearman", "value": 86.0120242985069}, {"type": "euclidean_pearson", "value": 86.07045247599824}, {"type": "euclidean_spearman", "value": 86.22185577032168}, {"type": "manhattan_pearson", "value": 85.79555943035328}, {"type": "manhattan_spearman", "value": 86.13821651705776}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS15", "type": "mteb/sts15-sts", "config": "default", "split": "test", "revision": "ae752c7c21bf194d8b67fd573edf7ae58183cbe3"}, "metrics": [{"type": "cos_sim_pearson", "value": 89.395594115739}, {"type": "cos_sim_spearman", "value": 89.70312809978681}, {"type": "euclidean_pearson", "value": 89.10137224981938}, {"type": "euclidean_spearman", "value": 89.74149793061072}, {"type": "manhattan_pearson", "value": 89.06144914118401}, {"type": "manhattan_spearman", "value": 
89.78489015365638}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS16", "type": "mteb/sts16-sts", "config": "default", "split": "test", "revision": "4d8694f8f0e0100860b497b999b3dbed754a0513"}, "metrics": [{"type": "cos_sim_pearson", "value": 86.1720394205624}, {"type": "cos_sim_spearman", "value": 87.67900288178751}, {"type": "euclidean_pearson", "value": 86.73052291563968}, {"type": "euclidean_spearman", "value": 87.49116803671033}, {"type": "manhattan_pearson", "value": 86.79988999910331}, {"type": "manhattan_spearman", "value": 87.57540934207157}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS17 (en-en)", "type": "mteb/sts17-crosslingual-sts", "config": "en-en", "split": "test", "revision": "af5e6fb845001ecf41f4c1e033ce921939a2a68d"}, "metrics": [{"type": "cos_sim_pearson", "value": 88.75286004155564}, {"type": "cos_sim_spearman", "value": 88.03161515281518}, {"type": "euclidean_pearson", "value": 88.55464128719427}, {"type": "euclidean_spearman", "value": 87.78041200668837}, {"type": "manhattan_pearson", "value": 88.18469209314583}, {"type": "manhattan_spearman", "value": 87.31602253333598}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS22 (en)", "type": "mteb/sts22-crosslingual-sts", "config": "en", "split": "test", "revision": "eea2b4fe26a775864c896887d910b76a8098ad3f"}, "metrics": [{"type": "cos_sim_pearson", "value": 70.48372140035973}, {"type": "cos_sim_spearman", "value": 70.16107814793419}, {"type": "euclidean_pearson", "value": 69.65789511103976}, {"type": "euclidean_spearman", "value": 68.92441073988654}, {"type": "manhattan_pearson", "value": 69.55306498752127}, {"type": "manhattan_spearman", "value": 68.82186378798527}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STSBenchmark", "type": "mteb/stsbenchmark-sts", "config": "default", "split": "test", "revision": "b0fddb56ed78048fa8b90373c8a3cfc37b684831"}, "metrics": [{"type": "cos_sim_pearson", "value": 87.43017430741797}, {"type": "cos_sim_spearman", "value": 88.14675226940803}, {"type": "euclidean_pearson", "value": 87.33329490848514}, {"type": "euclidean_spearman", "value": 87.94164481397011}, {"type": "manhattan_pearson", "value": 87.19303598684772}, {"type": "manhattan_spearman", "value": 87.86899889639051}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB SciDocsRR", "type": "mteb/scidocs-reranking", "config": "default", "split": "test", "revision": "d3c5e1fc0b855ab6097bf1cda04dd73947d7caab"}, "metrics": [{"type": "map", "value": 87.03073019943413}, {"type": "mrr", "value": 96.67456726280255}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB SciFact", "type": "mteb/scifact", "config": "default", "split": "test", "revision": "0228b52cf27578f30900b9e5271d331663a030d7"}, "metrics": [{"type": "map_at_1", "value": 64.328}, {"type": "map_at_10", "value": 75.046}, {"type": "map_at_100", "value": 75.046}, {"type": "map_at_1000", "value": 75.046}, {"type": "map_at_20", "value": 75.046}, {"type": "map_at_3", "value": 72.42}, {"type": "map_at_5", "value": 73.88900000000001}, {"type": "mrr_at_1", "value": 67.667}, {"type": "mrr_at_10", "value": 76.19200000000001}, {"type": "mrr_at_100", "value": 76.19200000000001}, {"type": "mrr_at_1000", "value": 76.19200000000001}, {"type": "mrr_at_20", "value": 76.19200000000001}, {"type": "mrr_at_3", "value": 74.556}, {"type": "mrr_at_5", "value": 75.372}, {"type": "ndcg_at_1", "value": 67.667}, {"type": "ndcg_at_10", "value": 79.621}, {"type": "ndcg_at_100", "value": 79.621}, {"type": "ndcg_at_1000", "value": 79.621}, {"type": 
"ndcg_at_20", "value": 79.621}, {"type": "ndcg_at_3", "value": 75.506}, {"type": "ndcg_at_5", "value": 77.269}, {"type": "precision_at_1", "value": 67.667}, {"type": "precision_at_10", "value": 10.467}, {"type": "precision_at_100", "value": 1.047}, {"type": "precision_at_1000", "value": 0.105}, {"type": "precision_at_20", "value": 5.2330000000000005}, {"type": "precision_at_3", "value": 29.444}, {"type": "precision_at_5", "value": 19.133}, {"type": "recall_at_1", "value": 64.328}, {"type": "recall_at_10", "value": 92.389}, {"type": "recall_at_100", "value": 92.389}, {"type": "recall_at_1000", "value": 92.389}, {"type": "recall_at_20", "value": 92.389}, {"type": "recall_at_3", "value": 81.183}, {"type": "recall_at_5", "value": 85.60600000000001}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB SprintDuplicateQuestions", "type": "mteb/sprintduplicatequestions-pairclassification", "config": "default", "split": "test", "revision": "d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46"}, "metrics": [{"type": "cos_sim_accuracy", "value": 99.83762376237624}, {"type": "cos_sim_ap", "value": 96.51580702723564}, {"type": "cos_sim_f1", "value": 91.63265306122449}, {"type": "cos_sim_precision", "value": 93.54166666666667}, {"type": "cos_sim_recall", "value": 89.8}, {"type": "dot_accuracy", "value": 99.73663366336633}, {"type": "dot_ap", "value": 93.5764284433306}, {"type": "dot_f1", "value": 86.56565656565655}, {"type": "dot_precision", "value": 87.44897959183675}, {"type": "dot_recall", "value": 85.7}, {"type": "euclidean_accuracy", "value": 99.84059405940594}, {"type": "euclidean_ap", "value": 96.4738308210008}, {"type": "euclidean_f1", "value": 91.76470588235294}, {"type": "euclidean_precision", "value": 93.92670157068062}, {"type": "euclidean_recall", "value": 89.7}, {"type": "manhattan_accuracy", "value": 99.84356435643565}, {"type": "manhattan_ap", "value": 96.58366196890644}, {"type": "manhattan_f1", "value": 91.93054136874362}, {"type": "manhattan_precision", "value": 93.94572025052193}, {"type": "manhattan_recall", "value": 90.0}, {"type": "max_accuracy", "value": 99.84356435643565}, {"type": "max_ap", "value": 96.58366196890644}, {"type": "max_f1", "value": 91.93054136874362}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB StackExchangeClustering", "type": "mteb/stackexchange-clustering", "config": "default", "split": "test", "revision": "6cbc1f7b2bc0622f2e39d2c77fa502909748c259"}, "metrics": [{"type": "v_measure", "value": 71.3538865724681}, {"type": "v_measures", "value": [0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 
0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 
0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 
0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 
0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 
0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 
0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 
0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 
0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 
0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 
0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 
0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 
0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 
0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 
0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368, 0.7491029730754422, 0.6679041279132695, 0.6706416821131351, 0.7400759063631245, 0.7205282507088282, 0.7207474445915961, 0.7076023322461216, 0.7919544039062477, 0.697356950041273, 0.7155185617564064, 0.8170975778555782, 0.7758530037956233, 0.7557716341966847, 0.7418030161151182, 0.6544532124169519, 0.7116665112917787, 0.6779566961395338, 0.6721164638120183, 0.6901024025391699, 0.6457684359608986, 0.7074519871138994, 0.7296079088233842, 0.7023239980988409, 0.6900078050266639, 0.6850583572154368]}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB StackExchangeClusteringP2P", "type": "mteb/stackexchange-clustering-p2p", "config": "default", "split": "test", "revision": "815ca46b2622cec33ccafc3735d572c266efdb44"}, "metrics": [{"type": "v_measure", "value": 36.11009155563876}, {"type": "v_measures", "value": [0.35242637090556483, 0.34198478937626525, 0.3480143704468013, 0.3432433824651389, 0.34581837944580823, 0.38852793624316134, 0.3664105091244259, 0.3798083138774721, 0.37268279094517115, 0.37209231273406684, 0.35242637090556483, 0.34198478937626525, 0.3480143704468013, 0.3432433824651389, 0.34581837944580823, 0.38852793624316134, 0.3664105091244259, 0.3798083138774721, 0.37268279094517115, 0.37209231273406684, 0.35242637090556483, 0.34198478937626525, 0.3480143704468013, 0.3432433824651389, 
0.34581837944580823, 0.38852793624316134, 0.3664105091244259, 0.3798083138774721, 0.37268279094517115, 0.37209231273406684, 0.35242637090556483, 0.34198478937626525, 
0.3480143704468013, 0.3432433824651389, 0.34581837944580823, 0.38852793624316134, 0.3664105091244259, 0.3798083138774721, 0.37268279094517115, 0.37209231273406684, 0.35242637090556483, 0.34198478937626525, 0.3480143704468013, 0.3432433824651389, 0.34581837944580823, 0.38852793624316134, 0.3664105091244259, 0.3798083138774721, 0.37268279094517115, 0.37209231273406684, 0.35242637090556483, 0.34198478937626525, 0.3480143704468013, 0.3432433824651389, 0.34581837944580823, 0.38852793624316134, 0.3664105091244259, 0.3798083138774721, 0.37268279094517115, 0.37209231273406684, 0.35242637090556483, 0.34198478937626525, 0.3480143704468013, 0.3432433824651389, 0.34581837944580823, 0.38852793624316134, 0.3664105091244259, 0.3798083138774721, 0.37268279094517115, 0.37209231273406684, 0.35242637090556483, 0.34198478937626525, 0.3480143704468013, 0.3432433824651389, 0.34581837944580823, 0.38852793624316134, 0.3664105091244259, 0.3798083138774721, 0.37268279094517115, 0.37209231273406684, 0.35242637090556483, 0.34198478937626525, 0.3480143704468013, 0.3432433824651389, 0.34581837944580823, 0.38852793624316134, 0.3664105091244259, 0.3798083138774721, 0.37268279094517115, 0.37209231273406684, 0.35242637090556483, 0.34198478937626525, 0.3480143704468013, 0.3432433824651389, 0.34581837944580823, 0.38852793624316134, 0.3664105091244259, 0.3798083138774721, 0.37268279094517115, 0.37209231273406684, 0.35242637090556483, 0.34198478937626525, 0.3480143704468013, 0.3432433824651389, 0.34581837944580823, 0.38852793624316134, 0.3664105091244259, 0.3798083138774721, 0.37268279094517115, 0.37209231273406684, 0.35242637090556483, 0.34198478937626525, 0.3480143704468013, 0.3432433824651389, 0.34581837944580823, 0.38852793624316134, 0.3664105091244259, 0.3798083138774721, 0.37268279094517115, 0.37209231273406684, 0.35242637090556483, 0.34198478937626525, 0.3480143704468013, 0.3432433824651389, 0.34581837944580823, 0.38852793624316134, 0.3664105091244259, 0.3798083138774721, 0.37268279094517115, 0.37209231273406684, 0.35242637090556483, 0.34198478937626525, 0.3480143704468013, 0.3432433824651389, 0.34581837944580823, 0.38852793624316134, 0.3664105091244259, 0.3798083138774721, 0.37268279094517115, 0.37209231273406684, 0.35242637090556483, 0.34198478937626525, 0.3480143704468013, 0.3432433824651389, 0.34581837944580823, 0.38852793624316134, 0.3664105091244259, 0.3798083138774721, 0.37268279094517115, 0.37209231273406684, 0.35242637090556483, 0.34198478937626525, 0.3480143704468013, 0.3432433824651389, 0.34581837944580823, 0.38852793624316134, 0.3664105091244259, 0.3798083138774721, 0.37268279094517115, 0.37209231273406684, 0.35242637090556483, 0.34198478937626525, 0.3480143704468013, 0.3432433824651389, 0.34581837944580823, 0.38852793624316134, 0.3664105091244259, 0.3798083138774721, 0.37268279094517115, 0.37209231273406684, 0.35242637090556483, 0.34198478937626525, 0.3480143704468013, 0.3432433824651389, 0.34581837944580823, 0.38852793624316134, 0.3664105091244259, 0.3798083138774721, 0.37268279094517115, 0.37209231273406684, 0.35242637090556483, 0.34198478937626525, 0.3480143704468013, 0.3432433824651389, 0.34581837944580823, 0.38852793624316134, 0.3664105091244259, 0.3798083138774721, 0.37268279094517115, 0.37209231273406684, 0.35242637090556483, 0.34198478937626525, 0.3480143704468013, 0.3432433824651389, 0.34581837944580823, 0.38852793624316134, 0.3664105091244259, 0.3798083138774721, 0.37268279094517115, 0.37209231273406684, 0.35242637090556483, 0.34198478937626525, 0.3480143704468013, 0.3432433824651389, 
0.34581837944580823, 0.38852793624316134, 0.3664105091244259, 0.3798083138774721, 0.37268279094517115, 0.37209231273406684, 0.35242637090556483, 0.34198478937626525, 0.3480143704468013, 0.3432433824651389, 0.34581837944580823, 0.38852793624316134, 0.3664105091244259, 0.3798083138774721, 0.37268279094517115, 0.37209231273406684, 0.35242637090556483, 0.34198478937626525, 0.3480143704468013, 0.3432433824651389, 0.34581837944580823, 0.38852793624316134, 0.3664105091244259, 0.3798083138774721, 0.37268279094517115, 0.37209231273406684, 0.35242637090556483, 0.34198478937626525, 0.3480143704468013, 0.3432433824651389, 0.34581837944580823, 0.38852793624316134, 0.3664105091244259, 0.3798083138774721, 0.37268279094517115, 0.37209231273406684, 0.35242637090556483, 0.34198478937626525, 0.3480143704468013, 0.3432433824651389, 0.34581837944580823, 0.38852793624316134, 0.3664105091244259, 0.3798083138774721, 0.37268279094517115, 0.37209231273406684, 0.35242637090556483, 0.34198478937626525, 0.3480143704468013, 0.3432433824651389, 0.34581837944580823, 0.38852793624316134, 0.3664105091244259, 0.3798083138774721, 0.37268279094517115, 0.37209231273406684, 0.35242637090556483, 0.34198478937626525, 0.3480143704468013, 0.3432433824651389, 0.34581837944580823, 0.38852793624316134, 0.3664105091244259, 0.3798083138774721, 0.37268279094517115, 0.37209231273406684, 0.35242637090556483, 0.34198478937626525, 0.3480143704468013, 0.3432433824651389, 0.34581837944580823, 0.38852793624316134, 0.3664105091244259, 0.3798083138774721, 0.37268279094517115, 0.37209231273406684, 0.35242637090556483, 0.34198478937626525, 0.3480143704468013, 0.3432433824651389, 0.34581837944580823, 0.38852793624316134, 0.3664105091244259, 0.3798083138774721, 0.37268279094517115, 0.37209231273406684, 0.35242637090556483, 0.34198478937626525, 0.3480143704468013, 0.3432433824651389, 0.34581837944580823, 0.38852793624316134, 0.3664105091244259, 0.3798083138774721, 0.37268279094517115, 0.37209231273406684, 0.35242637090556483, 0.34198478937626525, 0.3480143704468013, 0.3432433824651389, 0.34581837944580823, 0.38852793624316134, 0.3664105091244259, 0.3798083138774721, 0.37268279094517115, 0.37209231273406684, 0.35242637090556483, 0.34198478937626525, 0.3480143704468013, 0.3432433824651389, 0.34581837944580823, 0.38852793624316134, 0.3664105091244259, 0.3798083138774721, 0.37268279094517115, 0.37209231273406684]}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB StackOverflowDupQuestions", "type": "mteb/stackoverflowdupquestions-reranking", "config": "default", "split": "test", "revision": "e185fbe320c72810689fc5848eb6114e1ef5ec69"}, "metrics": [{"type": "map", "value": 55.54551767207771}, {"type": "mrr", "value": 56.55926385705797}]}, {"task": {"type": "Summarization"}, "dataset": {"name": "MTEB SummEval", "type": "mteb/summeval", "config": "default", "split": "test", "revision": "cda12ad7615edc362dbf25a00fdd61d3b1eaf93c"}, "metrics": [{"type": "cos_sim_pearson", "value": 30.805678984951985}, {"type": "cos_sim_spearman", "value": 30.827574116605362}, {"type": "dot_pearson", "value": 29.899814768586204}, {"type": "dot_spearman", "value": 29.588760095881174}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB TRECCOVID", "type": "mteb/trec-covid", "config": "default", "split": "test", "revision": "bb9466bac8153a0349341eb1b22e06409e78ef4e"}, "metrics": [{"type": "map_at_1", "value": 0.22200000000000003}, {"type": "map_at_10", "value": 2.046}, {"type": "map_at_100", "value": 2.046}, {"type": "map_at_1000", "value": 2.046}, {"type": 
"map_at_20", "value": 2.046}, {"type": "map_at_3", "value": 0.661}, {"type": "map_at_5", "value": 1.057}, {"type": "mrr_at_1", "value": 84.0}, {"type": "mrr_at_10", "value": 91.333}, {"type": "mrr_at_100", "value": 91.333}, {"type": "mrr_at_1000", "value": 91.333}, {"type": "mrr_at_20", "value": 91.333}, {"type": "mrr_at_3", "value": 91.0}, {"type": "mrr_at_5", "value": 91.0}, {"type": "ndcg_at_1", "value": 80.0}, {"type": "ndcg_at_10", "value": 80.74900000000001}, {"type": "ndcg_at_100", "value": 17.761}, {"type": "ndcg_at_1000", "value": 7.5920000000000005}, {"type": "ndcg_at_20", "value": 52.113}, {"type": "ndcg_at_3", "value": 83.542}, {"type": "ndcg_at_5", "value": 82.151}, {"type": "precision_at_1", "value": 84.0}, {"type": "precision_at_10", "value": 84.6}, {"type": "precision_at_100", "value": 8.459999999999999}, {"type": "precision_at_1000", "value": 0.8460000000000001}, {"type": "precision_at_20", "value": 42.3}, {"type": "precision_at_3", "value": 88.0}, {"type": "precision_at_5", "value": 86.0}, {"type": "recall_at_1", "value": 0.22200000000000003}, {"type": "recall_at_10", "value": 2.235}, {"type": "recall_at_100", "value": 2.235}, {"type": "recall_at_1000", "value": 2.235}, {"type": "recall_at_20", "value": 2.235}, {"type": "recall_at_3", "value": 0.695}, {"type": "recall_at_5", "value": 1.121}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB Touche2020", "type": "mteb/touche2020", "config": "default", "split": "test", "revision": "a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f"}, "metrics": [{"type": "map_at_1", "value": 3.2750000000000004}, {"type": "map_at_10", "value": 10.514}, {"type": "map_at_100", "value": 10.514}, {"type": "map_at_1000", "value": 10.514}, {"type": "map_at_20", "value": 10.514}, {"type": "map_at_3", "value": 5.662}, {"type": "map_at_5", "value": 7.808}, {"type": "mrr_at_1", "value": 40.816}, {"type": "mrr_at_10", "value": 49.88}, {"type": "mrr_at_100", "value": 49.88}, {"type": "mrr_at_1000", "value": 49.88}, {"type": "mrr_at_20", "value": 49.88}, {"type": "mrr_at_3", "value": 46.259}, {"type": "mrr_at_5", "value": 47.585}, {"type": "ndcg_at_1", "value": 37.755}, {"type": "ndcg_at_10", "value": 25.237}, {"type": "ndcg_at_100", "value": 21.149}, {"type": "ndcg_at_1000", "value": 21.149}, {"type": "ndcg_at_20", "value": 21.401999999999997}, {"type": "ndcg_at_3", "value": 27.465}, {"type": "ndcg_at_5", "value": 26.169999999999998}, {"type": "precision_at_1", "value": 40.816}, {"type": "precision_at_10", "value": 21.224}, {"type": "precision_at_100", "value": 2.122}, {"type": "precision_at_1000", "value": 0.212}, {"type": "precision_at_20", "value": 10.612}, {"type": "precision_at_3", "value": 26.531}, {"type": "precision_at_5", "value": 24.490000000000002}, {"type": "recall_at_1", "value": 3.2750000000000004}, {"type": "recall_at_10", "value": 16.264}, {"type": "recall_at_100", "value": 16.264}, {"type": "recall_at_1000", "value": 16.264}, {"type": "recall_at_20", "value": 16.264}, {"type": "recall_at_3", "value": 6.265999999999999}, {"type": "recall_at_5", "value": 9.677}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB ToxicConversationsClassification", "type": "mteb/toxic_conversations_50k", "config": "default", "split": "test", "revision": "edfaf9da55d3dd50d43143d90c1ac476895ae6de"}, "metrics": [{"type": "accuracy", "value": 66.181640625}, {"type": "ap", "value": 12.61343083198892}, {"type": "f1", "value": 51.12214559856414}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB 
TweetSentimentExtractionClassification", "type": "mteb/tweet_sentiment_extraction", "config": "default", "split": "test", "revision": "d604517c81ca91fe16a244d1248fc021f9ecee7a"}, "metrics": [{"type": "accuracy", "value": 62.543859649122815}, {"type": "f1", "value": 62.742315191046295}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB TwentyNewsgroupsClustering", "type": "mteb/twentynewsgroups-clustering", "config": "default", "split": "test", "revision": "6125ec4e24fa026cec8a478383ee943acfbd5449"}, "metrics": [{"type": "v_measure", "value": 54.7799424517948}, {"type": "v_measures", "value": [0.550822270643678, 0.5550309505411892, 0.5374116804548088, 0.530806408291854, 0.5520216200733947, 0.5723223656123475, 0.5487505833189581, 0.5496668776225391, 0.5230606424471813, 0.5581008461735308]}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB TwitterSemEval2015", "type": "mteb/twittersemeval2015-pairclassification", "config": "default", "split": "test", "revision": "70970daeab8776df92f5ea462b6173c0b46fd2d1"}, "metrics": [{"type": "cos_sim_accuracy", "value": 88.24581271979496}, {"type": "cos_sim_ap", "value": 81.34631603712425}, {"type": "cos_sim_f1", "value": 73.6588459099556}, {"type": "cos_sim_precision", "value": 70.91575091575092}, {"type": "cos_sim_recall", "value": 76.62269129287598}, {"type": "dot_accuracy", "value": 86.33247898909221}, {"type": "dot_ap", "value": 74.8713850965631}, {"type": "dot_f1", "value": 69.68152866242038}, {"type": "dot_precision", "value": 67.36453201970444}, {"type": "dot_recall", "value": 72.16358839050132}, {"type": "euclidean_accuracy", "value": 88.37098408535495}, {"type":
"euclidean_ap", "value": 81.3880827682646}, {"type": "euclidean_f1", "value": 73.69367056104764}, {"type": "euclidean_precision", "value": 71.76794198549638}, {"type": "euclidean_recall", "value": 75.72559366754618}, {"type": "manhattan_accuracy", "value": 88.28157596709781}, {"type": "manhattan_ap", "value": 81.11568493905267}, {"type": "manhattan_f1", "value": 73.38364779874215}, {"type": "manhattan_precision", "value": 70.1201923076923}, {"type": "manhattan_recall", "value": 76.96569920844327}, {"type": "max_accuracy", "value": 88.37098408535495}, {"type": "max_ap", "value": 81.3880827682646}, {"type": "max_f1", "value": 73.69367056104764}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB TwitterURLCorpus", "type": "mteb/twitterurlcorpus-pairclassification", "config": "default", "split": "test", "revision": "8b6510b0b1fa4e4c4f879467980e9be563ec1cdf"}, "metrics": [{"type": "cos_sim_accuracy", "value": 89.54476656188147}, {"type": "cos_sim_ap", "value": 86.93964282285746}, {"type": "cos_sim_f1", "value": 79.50401702190103}, {"type": "cos_sim_precision", "value": 75.93020811435778}, {"type": "cos_sim_recall", "value": 83.43085925469664}, {"type": "dot_accuracy", "value": 88.64050917840649}, {"type": "dot_ap", "value": 84.81007248888473}, {"type": "dot_f1", "value": 77.95706670508572}, {"type": "dot_precision", "value": 73.24038982133189}, {"type": "dot_recall", "value": 83.32306744687403}, {"type": "euclidean_accuracy", "value": 89.53894516241705}, {"type": "euclidean_ap", "value": 86.92299719471643}, {"type": "euclidean_f1", "value": 79.55922060862585}, {"type": "euclidean_precision", "value": 75.61381606325426}, {"type": "euclidean_recall", "value": 83.93902063443178}, {"type": "manhattan_accuracy", "value": 89.5234214305119}, {"type": "manhattan_ap", "value": 86.93261273512803}, {"type": "manhattan_f1", "value": 79.54703705061019}, {"type": "manhattan_precision", "value": 75.90041261626688}, {"type": "manhattan_recall", "value": 83.56174930705266}, {"type": "max_accuracy", "value": 89.54476656188147}, {"type": "max_ap", "value": 86.93964282285746}, {"type": "max_f1", "value": 79.55922060862585}]}]}]} |
Geolumina/instructor-xl | Geolumina | sentence-similarity | [
"sentence-transformers",
"pytorch",
"t5",
"text-embedding",
"embeddings",
"information-retrieval",
"beir",
"text-classification",
"language-model",
"text-clustering",
"text-semantic-similarity",
"text-evaluation",
"prompt-retrieval",
"text-reranking",
"feature-extraction",
"sentence-similarity",
"transformers",
"English",
"Sentence Similarity",
"natural_questions",
"ms_marco",
"fever",
"hotpot_qa",
"mteb",
"en",
"arxiv:2212.09741",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
]
| 2024-03-05T03:00:52 | 2025-02-05T20:03:17 | 9 | 1 | ---
language: en
license: apache-2.0
pipeline_tag: sentence-similarity
tags:
- text-embedding
- embeddings
- information-retrieval
- beir
- text-classification
- language-model
- text-clustering
- text-semantic-similarity
- text-evaluation
- prompt-retrieval
- text-reranking
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
- t5
- English
- Sentence Similarity
- natural_questions
- ms_marco
- fever
- hotpot_qa
- mteb
inference: false
model-index:
- name: final_xl_results
results:
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (en)
type: mteb/amazon_counterfactual
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 85.08955223880596
- type: ap
value: 52.66066378722476
- type: f1
value: 79.63340218960269
- task:
type: Classification
dataset:
name: MTEB AmazonPolarityClassification
type: mteb/amazon_polarity
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 86.542
- type: ap
value: 81.92695193008987
- type: f1
value: 86.51466132573681
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (en)
type: mteb/amazon_reviews_multi
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 42.964
- type: f1
value: 41.43146249774862
- task:
type: Retrieval
dataset:
name: MTEB ArguAna
type: arguana
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 29.872
- type: map_at_10
value: 46.342
- type: map_at_100
value: 47.152
- type: map_at_1000
value: 47.154
- type: map_at_3
value: 41.216
- type: map_at_5
value: 44.035999999999994
- type: mrr_at_1
value: 30.939
- type: mrr_at_10
value: 46.756
- type: mrr_at_100
value: 47.573
- type: mrr_at_1000
value: 47.575
- type: mrr_at_3
value: 41.548
- type: mrr_at_5
value: 44.425
- type: ndcg_at_1
value: 29.872
- type: ndcg_at_10
value: 55.65
- type: ndcg_at_100
value: 58.88099999999999
- type: ndcg_at_1000
value: 58.951
- type: ndcg_at_3
value: 45.0
- type: ndcg_at_5
value: 50.09
- type: precision_at_1
value: 29.872
- type: precision_at_10
value: 8.549
- type: precision_at_100
value: 0.991
- type: precision_at_1000
value: 0.1
- type: precision_at_3
value: 18.658
- type: precision_at_5
value: 13.669999999999998
- type: recall_at_1
value: 29.872
- type: recall_at_10
value: 85.491
- type: recall_at_100
value: 99.075
- type: recall_at_1000
value: 99.644
- type: recall_at_3
value: 55.974000000000004
- type: recall_at_5
value: 68.35
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringP2P
type: mteb/arxiv-clustering-p2p
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 42.452729850641276
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringS2S
type: mteb/arxiv-clustering-s2s
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 32.21141846480423
- task:
type: Reranking
dataset:
name: MTEB AskUbuntuDupQuestions
type: mteb/askubuntudupquestions-reranking
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 65.34710928952622
- type: mrr
value: 77.61124301983028
- task:
type: STS
dataset:
name: MTEB BIOSSES
type: mteb/biosses-sts
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_spearman
value: 84.15312230525639
- task:
type: Classification
dataset:
name: MTEB Banking77Classification
type: mteb/banking77
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 82.66233766233766
- type: f1
value: 82.04175284777669
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringP2P
type: mteb/biorxiv-clustering-p2p
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 37.36697339826455
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringS2S
type: mteb/biorxiv-clustering-s2s
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 30.551241447593092
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackAndroidRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 36.797000000000004
- type: map_at_10
value: 48.46
- type: map_at_100
value: 49.968
- type: map_at_1000
value: 50.080000000000005
- type: map_at_3
value: 44.71
- type: map_at_5
value: 46.592
- type: mrr_at_1
value: 45.494
- type: mrr_at_10
value: 54.747
- type: mrr_at_100
value: 55.43599999999999
- type: mrr_at_1000
value: 55.464999999999996
- type: mrr_at_3
value: 52.361000000000004
- type: mrr_at_5
value: 53.727000000000004
- type: ndcg_at_1
value: 45.494
- type: ndcg_at_10
value: 54.989
- type: ndcg_at_100
value: 60.096000000000004
- type: ndcg_at_1000
value: 61.58
- type: ndcg_at_3
value: 49.977
- type: ndcg_at_5
value: 51.964999999999996
- type: precision_at_1
value: 45.494
- type: precision_at_10
value: 10.558
- type: precision_at_100
value: 1.6049999999999998
- type: precision_at_1000
value: 0.203
- type: precision_at_3
value: 23.796
- type: precision_at_5
value: 16.881
- type: recall_at_1
value: 36.797000000000004
- type: recall_at_10
value: 66.83
- type: recall_at_100
value: 88.34100000000001
- type: recall_at_1000
value: 97.202
- type: recall_at_3
value: 51.961999999999996
- type: recall_at_5
value: 57.940000000000005
- type: map_at_1
value: 32.597
- type: map_at_10
value: 43.424
- type: map_at_100
value: 44.78
- type: map_at_1000
value: 44.913
- type: map_at_3
value: 40.315
- type: map_at_5
value: 41.987
- type: mrr_at_1
value: 40.382
- type: mrr_at_10
value: 49.219
- type: mrr_at_100
value: 49.895
- type: mrr_at_1000
value: 49.936
- type: mrr_at_3
value: 46.996
- type: mrr_at_5
value: 48.231
- type: ndcg_at_1
value: 40.382
- type: ndcg_at_10
value: 49.318
- type: ndcg_at_100
value: 53.839999999999996
- type: ndcg_at_1000
value: 55.82899999999999
- type: ndcg_at_3
value: 44.914
- type: ndcg_at_5
value: 46.798
- type: precision_at_1
value: 40.382
- type: precision_at_10
value: 9.274000000000001
- type: precision_at_100
value: 1.497
- type: precision_at_1000
value: 0.198
- type: precision_at_3
value: 21.592
- type: precision_at_5
value: 15.159
- type: recall_at_1
value: 32.597
- type: recall_at_10
value: 59.882000000000005
- type: recall_at_100
value: 78.446
- type: recall_at_1000
value: 90.88000000000001
- type: recall_at_3
value: 46.9
- type: recall_at_5
value: 52.222
- type: map_at_1
value: 43.8
- type: map_at_10
value: 57.293000000000006
- type: map_at_100
value: 58.321
- type: map_at_1000
value: 58.361
- type: map_at_3
value: 53.839999999999996
- type: map_at_5
value: 55.838
- type: mrr_at_1
value: 49.592000000000006
- type: mrr_at_10
value: 60.643
- type: mrr_at_100
value: 61.23499999999999
- type: mrr_at_1000
value: 61.251999999999995
- type: mrr_at_3
value: 58.265
- type: mrr_at_5
value: 59.717
- type: ndcg_at_1
value: 49.592000000000006
- type: ndcg_at_10
value: 63.364
- type: ndcg_at_100
value: 67.167
- type: ndcg_at_1000
value: 67.867
- type: ndcg_at_3
value: 57.912
- type: ndcg_at_5
value: 60.697
- type: precision_at_1
value: 49.592000000000006
- type: precision_at_10
value: 10.088
- type: precision_at_100
value: 1.2930000000000001
- type: precision_at_1000
value: 0.13899999999999998
- type: precision_at_3
value: 25.789
- type: precision_at_5
value: 17.541999999999998
- type: recall_at_1
value: 43.8
- type: recall_at_10
value: 77.635
- type: recall_at_100
value: 93.748
- type: recall_at_1000
value: 98.468
- type: recall_at_3
value: 63.223
- type: recall_at_5
value: 70.122
- type: map_at_1
value: 27.721
- type: map_at_10
value: 35.626999999999995
- type: map_at_100
value: 36.719
- type: map_at_1000
value: 36.8
- type: map_at_3
value: 32.781
- type: map_at_5
value: 34.333999999999996
- type: mrr_at_1
value: 29.604999999999997
- type: mrr_at_10
value: 37.564
- type: mrr_at_100
value: 38.505
- type: mrr_at_1000
value: 38.565
- type: mrr_at_3
value: 34.727000000000004
- type: mrr_at_5
value: 36.207
- type: ndcg_at_1
value: 29.604999999999997
- type: ndcg_at_10
value: 40.575
- type: ndcg_at_100
value: 45.613
- type: ndcg_at_1000
value: 47.676
- type: ndcg_at_3
value: 34.811
- type: ndcg_at_5
value: 37.491
- type: precision_at_1
value: 29.604999999999997
- type: precision_at_10
value: 6.1690000000000005
- type: precision_at_100
value: 0.906
- type: precision_at_1000
value: 0.11199999999999999
- type: precision_at_3
value: 14.237
- type: precision_at_5
value: 10.056
- type: recall_at_1
value: 27.721
- type: recall_at_10
value: 54.041
- type: recall_at_100
value: 76.62299999999999
- type: recall_at_1000
value: 92.134
- type: recall_at_3
value: 38.582
- type: recall_at_5
value: 44.989000000000004
- type: map_at_1
value: 16.553
- type: map_at_10
value: 25.384
- type: map_at_100
value: 26.655
- type: map_at_1000
value: 26.778000000000002
- type: map_at_3
value: 22.733
- type: map_at_5
value: 24.119
- type: mrr_at_1
value: 20.149
- type: mrr_at_10
value: 29.705
- type: mrr_at_100
value: 30.672
- type: mrr_at_1000
value: 30.737
- type: mrr_at_3
value: 27.032
- type: mrr_at_5
value: 28.369
- type: ndcg_at_1
value: 20.149
- type: ndcg_at_10
value: 30.843999999999998
- type: ndcg_at_100
value: 36.716
- type: ndcg_at_1000
value: 39.495000000000005
- type: ndcg_at_3
value: 25.918999999999997
- type: ndcg_at_5
value: 27.992
- type: precision_at_1
value: 20.149
- type: precision_at_10
value: 5.858
- type: precision_at_100
value: 1.009
- type: precision_at_1000
value: 0.13799999999999998
- type: precision_at_3
value: 12.645000000000001
- type: precision_at_5
value: 9.179
- type: recall_at_1
value: 16.553
- type: recall_at_10
value: 43.136
- type: recall_at_100
value: 68.562
- type: recall_at_1000
value: 88.208
- type: recall_at_3
value: 29.493000000000002
- type: recall_at_5
value: 34.751
- type: map_at_1
value: 28.000999999999998
- type: map_at_10
value: 39.004
- type: map_at_100
value: 40.461999999999996
- type: map_at_1000
value: 40.566
- type: map_at_3
value: 35.805
- type: map_at_5
value: 37.672
- type: mrr_at_1
value: 33.782000000000004
- type: mrr_at_10
value: 44.702
- type: mrr_at_100
value: 45.528
- type: mrr_at_1000
value: 45.576
- type: mrr_at_3
value: 42.14
- type: mrr_at_5
value: 43.651
- type: ndcg_at_1
value: 33.782000000000004
- type: ndcg_at_10
value: 45.275999999999996
- type: ndcg_at_100
value: 50.888
- type: ndcg_at_1000
value: 52.879
- type: ndcg_at_3
value: 40.191
- type: ndcg_at_5
value: 42.731
- type: precision_at_1
value: 33.782000000000004
- type: precision_at_10
value: 8.200000000000001
- type: precision_at_100
value: 1.287
- type: precision_at_1000
value: 0.16199999999999998
- type: precision_at_3
value: 19.185
- type: precision_at_5
value: 13.667000000000002
- type: recall_at_1
value: 28.000999999999998
- type: recall_at_10
value: 58.131
- type: recall_at_100
value: 80.869
- type: recall_at_1000
value: 93.931
- type: recall_at_3
value: 44.161
- type: recall_at_5
value: 50.592000000000006
- type: map_at_1
value: 28.047
- type: map_at_10
value: 38.596000000000004
- type: map_at_100
value: 40.116
- type: map_at_1000
value: 40.232
- type: map_at_3
value: 35.205
- type: map_at_5
value: 37.076
- type: mrr_at_1
value: 34.932
- type: mrr_at_10
value: 44.496
- type: mrr_at_100
value: 45.47
- type: mrr_at_1000
value: 45.519999999999996
- type: mrr_at_3
value: 41.743
- type: mrr_at_5
value: 43.352000000000004
- type: ndcg_at_1
value: 34.932
- type: ndcg_at_10
value: 44.901
- type: ndcg_at_100
value: 50.788999999999994
- type: ndcg_at_1000
value: 52.867
- type: ndcg_at_3
value: 39.449
- type: ndcg_at_5
value: 41.929
- type: precision_at_1
value: 34.932
- type: precision_at_10
value: 8.311
- type: precision_at_100
value: 1.3050000000000002
- type: precision_at_1000
value: 0.166
- type: precision_at_3
value: 18.836
- type: precision_at_5
value: 13.447000000000001
- type: recall_at_1
value: 28.047
- type: recall_at_10
value: 57.717
- type: recall_at_100
value: 82.182
- type: recall_at_1000
value: 95.82000000000001
- type: recall_at_3
value: 42.448
- type: recall_at_5
value: 49.071
- type: map_at_1
value: 27.861250000000005
- type: map_at_10
value: 37.529583333333335
- type: map_at_100
value: 38.7915
- type: map_at_1000
value: 38.90558333333335
- type: map_at_3
value: 34.57333333333333
- type: map_at_5
value: 36.187166666666656
- type: mrr_at_1
value: 32.88291666666666
- type: mrr_at_10
value: 41.79750000000001
- type: mrr_at_100
value: 42.63183333333333
- type: mrr_at_1000
value: 42.68483333333333
- type: mrr_at_3
value: 39.313750000000006
- type: mrr_at_5
value: 40.70483333333333
- type: ndcg_at_1
value: 32.88291666666666
- type: ndcg_at_10
value: 43.09408333333333
- type: ndcg_at_100
value: 48.22158333333333
- type: ndcg_at_1000
value: 50.358000000000004
- type: ndcg_at_3
value: 38.129583333333336
- type: ndcg_at_5
value: 40.39266666666666
- type: precision_at_1
value: 32.88291666666666
- type: precision_at_10
value: 7.5584999999999996
- type: precision_at_100
value: 1.1903333333333332
- type: precision_at_1000
value: 0.15658333333333332
- type: precision_at_3
value: 17.495916666666666
- type: precision_at_5
value: 12.373833333333332
- type: recall_at_1
value: 27.861250000000005
- type: recall_at_10
value: 55.215916666666665
- type: recall_at_100
value: 77.392
- type: recall_at_1000
value: 92.04908333333334
- type: recall_at_3
value: 41.37475
- type: recall_at_5
value: 47.22908333333333
- type: map_at_1
value: 25.064999999999998
- type: map_at_10
value: 31.635999999999996
- type: map_at_100
value: 32.596000000000004
- type: map_at_1000
value: 32.695
- type: map_at_3
value: 29.612
- type: map_at_5
value: 30.768
- type: mrr_at_1
value: 28.528
- type: mrr_at_10
value: 34.717
- type: mrr_at_100
value: 35.558
- type: mrr_at_1000
value: 35.626000000000005
- type: mrr_at_3
value: 32.745000000000005
- type: mrr_at_5
value: 33.819
- type: ndcg_at_1
value: 28.528
- type: ndcg_at_10
value: 35.647
- type: ndcg_at_100
value: 40.207
- type: ndcg_at_1000
value: 42.695
- type: ndcg_at_3
value: 31.878
- type: ndcg_at_5
value: 33.634
- type: precision_at_1
value: 28.528
- type: precision_at_10
value: 5.46
- type: precision_at_100
value: 0.84
- type: precision_at_1000
value: 0.11399999999999999
- type: precision_at_3
value: 13.547999999999998
- type: precision_at_5
value: 9.325
- type: recall_at_1
value: 25.064999999999998
- type: recall_at_10
value: 45.096000000000004
- type: recall_at_100
value: 65.658
- type: recall_at_1000
value: 84.128
- type: recall_at_3
value: 34.337
- type: recall_at_5
value: 38.849000000000004
- type: map_at_1
value: 17.276
- type: map_at_10
value: 24.535
- type: map_at_100
value: 25.655
- type: map_at_1000
value: 25.782
- type: map_at_3
value: 22.228
- type: map_at_5
value: 23.612
- type: mrr_at_1
value: 21.266
- type: mrr_at_10
value: 28.474
- type: mrr_at_100
value: 29.398000000000003
- type: mrr_at_1000
value: 29.482000000000003
- type: mrr_at_3
value: 26.245
- type: mrr_at_5
value: 27.624
- type: ndcg_at_1
value: 21.266
- type: ndcg_at_10
value: 29.087000000000003
- type: ndcg_at_100
value: 34.374
- type: ndcg_at_1000
value: 37.433
- type: ndcg_at_3
value: 25.040000000000003
- type: ndcg_at_5
value: 27.116
- type: precision_at_1
value: 21.266
- type: precision_at_10
value: 5.258
- type: precision_at_100
value: 0.9299999999999999
- type: precision_at_1000
value: 0.13699999999999998
- type: precision_at_3
value: 11.849
- type: precision_at_5
value: 8.699
- type: recall_at_1
value: 17.276
- type: recall_at_10
value: 38.928000000000004
- type: recall_at_100
value: 62.529
- type: recall_at_1000
value: 84.44800000000001
- type: recall_at_3
value: 27.554000000000002
- type: recall_at_5
value: 32.915
- type: map_at_1
value: 27.297
- type: map_at_10
value: 36.957
- type: map_at_100
value: 38.252
- type: map_at_1000
value: 38.356
- type: map_at_3
value: 34.121
- type: map_at_5
value: 35.782000000000004
- type: mrr_at_1
value: 32.275999999999996
- type: mrr_at_10
value: 41.198
- type: mrr_at_100
value: 42.131
- type: mrr_at_1000
value: 42.186
- type: mrr_at_3
value: 38.557
- type: mrr_at_5
value: 40.12
- type: ndcg_at_1
value: 32.275999999999996
- type: ndcg_at_10
value: 42.516
- type: ndcg_at_100
value: 48.15
- type: ndcg_at_1000
value: 50.344
- type: ndcg_at_3
value: 37.423
- type: ndcg_at_5
value: 39.919
- type: precision_at_1
value: 32.275999999999996
- type: precision_at_10
value: 7.155
- type: precision_at_100
value: 1.123
- type: precision_at_1000
value: 0.14200000000000002
- type: precision_at_3
value: 17.163999999999998
- type: precision_at_5
value: 12.127
- type: recall_at_1
value: 27.297
- type: recall_at_10
value: 55.238
- type: recall_at_100
value: 79.2
- type: recall_at_1000
value: 94.258
- type: recall_at_3
value: 41.327000000000005
- type: recall_at_5
value: 47.588
- type: map_at_1
value: 29.142000000000003
- type: map_at_10
value: 38.769
- type: map_at_100
value: 40.292
- type: map_at_1000
value: 40.510000000000005
- type: map_at_3
value: 35.39
- type: map_at_5
value: 37.009
- type: mrr_at_1
value: 34.19
- type: mrr_at_10
value: 43.418
- type: mrr_at_100
value: 44.132
- type: mrr_at_1000
value: 44.175
- type: mrr_at_3
value: 40.547
- type: mrr_at_5
value: 42.088
- type: ndcg_at_1
value: 34.19
- type: ndcg_at_10
value: 45.14
- type: ndcg_at_100
value: 50.364
- type: ndcg_at_1000
value: 52.481
- type: ndcg_at_3
value: 39.466
- type: ndcg_at_5
value: 41.772
- type: precision_at_1
value: 34.19
- type: precision_at_10
value: 8.715
- type: precision_at_100
value: 1.6150000000000002
- type: precision_at_1000
value: 0.247
- type: precision_at_3
value: 18.248
- type: precision_at_5
value: 13.161999999999999
- type: recall_at_1
value: 29.142000000000003
- type: recall_at_10
value: 57.577999999999996
- type: recall_at_100
value: 81.428
- type: recall_at_1000
value: 94.017
- type: recall_at_3
value: 41.402
- type: recall_at_5
value: 47.695
- type: map_at_1
value: 22.039
- type: map_at_10
value: 30.669999999999998
- type: map_at_100
value: 31.682
- type: map_at_1000
value: 31.794
- type: map_at_3
value: 28.139999999999997
- type: map_at_5
value: 29.457
- type: mrr_at_1
value: 24.399
- type: mrr_at_10
value: 32.687
- type: mrr_at_100
value: 33.622
- type: mrr_at_1000
value: 33.698
- type: mrr_at_3
value: 30.407
- type: mrr_at_5
value: 31.552999999999997
- type: ndcg_at_1
value: 24.399
- type: ndcg_at_10
value: 35.472
- type: ndcg_at_100
value: 40.455000000000005
- type: ndcg_at_1000
value: 43.15
- type: ndcg_at_3
value: 30.575000000000003
- type: ndcg_at_5
value: 32.668
- type: precision_at_1
value: 24.399
- type: precision_at_10
value: 5.656
- type: precision_at_100
value: 0.874
- type: precision_at_1000
value: 0.121
- type: precision_at_3
value: 13.062000000000001
- type: precision_at_5
value: 9.242
- type: recall_at_1
value: 22.039
- type: recall_at_10
value: 48.379
- type: recall_at_100
value: 71.11800000000001
- type: recall_at_1000
value: 91.095
- type: recall_at_3
value: 35.108
- type: recall_at_5
value: 40.015
- task:
type: Retrieval
dataset:
name: MTEB ClimateFEVER
type: climate-fever
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 10.144
- type: map_at_10
value: 18.238
- type: map_at_100
value: 20.143
- type: map_at_1000
value: 20.346
- type: map_at_3
value: 14.809
- type: map_at_5
value: 16.567999999999998
- type: mrr_at_1
value: 22.671
- type: mrr_at_10
value: 34.906
- type: mrr_at_100
value: 35.858000000000004
- type: mrr_at_1000
value: 35.898
- type: mrr_at_3
value: 31.238
- type: mrr_at_5
value: 33.342
- type: ndcg_at_1
value: 22.671
- type: ndcg_at_10
value: 26.540000000000003
- type: ndcg_at_100
value: 34.138000000000005
- type: ndcg_at_1000
value: 37.72
- type: ndcg_at_3
value: 20.766000000000002
- type: ndcg_at_5
value: 22.927
- type: precision_at_1
value: 22.671
- type: precision_at_10
value: 8.619
- type: precision_at_100
value: 1.678
- type: precision_at_1000
value: 0.23500000000000001
- type: precision_at_3
value: 15.592
- type: precision_at_5
value: 12.43
- type: recall_at_1
value: 10.144
- type: recall_at_10
value: 33.46
- type: recall_at_100
value: 59.758
- type: recall_at_1000
value: 79.704
- type: recall_at_3
value: 19.604
- type: recall_at_5
value: 25.367
- task:
type: Retrieval
dataset:
name: MTEB DBPedia
type: dbpedia-entity
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 8.654
- type: map_at_10
value: 18.506
- type: map_at_100
value: 26.412999999999997
- type: map_at_1000
value: 28.13
- type: map_at_3
value: 13.379
- type: map_at_5
value: 15.529000000000002
- type: mrr_at_1
value: 66.0
- type: mrr_at_10
value: 74.13
- type: mrr_at_100
value: 74.48700000000001
- type: mrr_at_1000
value: 74.49799999999999
- type: mrr_at_3
value: 72.75
- type: mrr_at_5
value: 73.762
- type: ndcg_at_1
value: 54.50000000000001
- type: ndcg_at_10
value: 40.236
- type: ndcg_at_100
value: 44.690999999999995
- type: ndcg_at_1000
value: 52.195
- type: ndcg_at_3
value: 45.632
- type: ndcg_at_5
value: 42.952
- type: precision_at_1
value: 66.0
- type: precision_at_10
value: 31.724999999999998
- type: precision_at_100
value: 10.299999999999999
- type: precision_at_1000
value: 2.194
- type: precision_at_3
value: 48.75
- type: precision_at_5
value: 41.6
- type: recall_at_1
value: 8.654
- type: recall_at_10
value: 23.74
- type: recall_at_100
value: 50.346999999999994
- type: recall_at_1000
value: 74.376
- type: recall_at_3
value: 14.636
- type: recall_at_5
value: 18.009
- task:
type: Classification
dataset:
name: MTEB EmotionClassification
type: mteb/emotion
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 53.245
- type: f1
value: 48.74520523753552
- task:
type: Retrieval
dataset:
name: MTEB FEVER
type: fever
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 51.729
- type: map_at_10
value: 63.904
- type: map_at_100
value: 64.363
- type: map_at_1000
value: 64.38199999999999
- type: map_at_3
value: 61.393
- type: map_at_5
value: 63.02100000000001
- type: mrr_at_1
value: 55.686
- type: mrr_at_10
value: 67.804
- type: mrr_at_100
value: 68.15299999999999
- type: mrr_at_1000
value: 68.161
- type: mrr_at_3
value: 65.494
- type: mrr_at_5
value: 67.01599999999999
- type: ndcg_at_1
value: 55.686
- type: ndcg_at_10
value: 70.025
- type: ndcg_at_100
value: 72.011
- type: ndcg_at_1000
value: 72.443
- type: ndcg_at_3
value: 65.32900000000001
- type: ndcg_at_5
value: 68.05600000000001
- type: precision_at_1
value: 55.686
- type: precision_at_10
value: 9.358
- type: precision_at_100
value: 1.05
- type: precision_at_1000
value: 0.11
- type: precision_at_3
value: 26.318
- type: precision_at_5
value: 17.321
- type: recall_at_1
value: 51.729
- type: recall_at_10
value: 85.04
- type: recall_at_100
value: 93.777
- type: recall_at_1000
value: 96.824
- type: recall_at_3
value: 72.521
- type: recall_at_5
value: 79.148
- task:
type: Retrieval
dataset:
name: MTEB FiQA2018
type: fiqa
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 23.765
- type: map_at_10
value: 39.114
- type: map_at_100
value: 40.987
- type: map_at_1000
value: 41.155
- type: map_at_3
value: 34.028000000000006
- type: map_at_5
value: 36.925000000000004
- type: mrr_at_1
value: 46.451
- type: mrr_at_10
value: 54.711
- type: mrr_at_100
value: 55.509
- type: mrr_at_1000
value: 55.535000000000004
- type: mrr_at_3
value: 52.649
- type: mrr_at_5
value: 53.729000000000006
- type: ndcg_at_1
value: 46.451
- type: ndcg_at_10
value: 46.955999999999996
- type: ndcg_at_100
value: 53.686
- type: ndcg_at_1000
value: 56.230000000000004
- type: ndcg_at_3
value: 43.374
- type: ndcg_at_5
value: 44.372
- type: precision_at_1
value: 46.451
- type: precision_at_10
value: 13.256
- type: precision_at_100
value: 2.019
- type: precision_at_1000
value: 0.247
- type: precision_at_3
value: 29.115000000000002
- type: precision_at_5
value: 21.389
- type: recall_at_1
value: 23.765
- type: recall_at_10
value: 53.452999999999996
- type: recall_at_100
value: 78.828
- type: recall_at_1000
value: 93.938
- type: recall_at_3
value: 39.023
- type: recall_at_5
value: 45.18
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA
type: hotpotqa
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 31.918000000000003
- type: map_at_10
value: 46.741
- type: map_at_100
value: 47.762
- type: map_at_1000
value: 47.849000000000004
- type: map_at_3
value: 43.578
- type: map_at_5
value: 45.395
- type: mrr_at_1
value: 63.834999999999994
- type: mrr_at_10
value: 71.312
- type: mrr_at_100
value: 71.695
- type: mrr_at_1000
value: 71.714
- type: mrr_at_3
value: 69.82000000000001
- type: mrr_at_5
value: 70.726
- type: ndcg_at_1
value: 63.834999999999994
- type: ndcg_at_10
value: 55.879999999999995
- type: ndcg_at_100
value: 59.723000000000006
- type: ndcg_at_1000
value: 61.49400000000001
- type: ndcg_at_3
value: 50.964
- type: ndcg_at_5
value: 53.47
- type: precision_at_1
value: 63.834999999999994
- type: precision_at_10
value: 11.845
- type: precision_at_100
value: 1.4869999999999999
- type: precision_at_1000
value: 0.172
- type: precision_at_3
value: 32.158
- type: precision_at_5
value: 21.278
- type: recall_at_1
value: 31.918000000000003
- type: recall_at_10
value: 59.223000000000006
- type: recall_at_100
value: 74.328
- type: recall_at_1000
value: 86.05000000000001
- type: recall_at_3
value: 48.238
- type: recall_at_5
value: 53.193999999999996
- task:
type: Classification
dataset:
name: MTEB ImdbClassification
type: mteb/imdb
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 79.7896
- type: ap
value: 73.65166029460288
- type: f1
value: 79.71794693711813
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO
type: msmarco
config: default
split: dev
revision: None
metrics:
- type: map_at_1
value: 22.239
- type: map_at_10
value: 34.542
- type: map_at_100
value: 35.717999999999996
- type: map_at_1000
value: 35.764
- type: map_at_3
value: 30.432
- type: map_at_5
value: 32.81
- type: mrr_at_1
value: 22.908
- type: mrr_at_10
value: 35.127
- type: mrr_at_100
value: 36.238
- type: mrr_at_1000
value: 36.278
- type: mrr_at_3
value: 31.076999999999998
- type: mrr_at_5
value: 33.419
- type: ndcg_at_1
value: 22.908
- type: ndcg_at_10
value: 41.607
- type: ndcg_at_100
value: 47.28
- type: ndcg_at_1000
value: 48.414
- type: ndcg_at_3
value: 33.253
- type: ndcg_at_5
value: 37.486000000000004
- type: precision_at_1
value: 22.908
- type: precision_at_10
value: 6.645
- type: precision_at_100
value: 0.9490000000000001
- type: precision_at_1000
value: 0.105
- type: precision_at_3
value: 14.130999999999998
- type: precision_at_5
value: 10.616
- type: recall_at_1
value: 22.239
- type: recall_at_10
value: 63.42
- type: recall_at_100
value: 89.696
- type: recall_at_1000
value: 98.351
- type: recall_at_3
value: 40.77
- type: recall_at_5
value: 50.93
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (en)
type: mteb/mtop_domain
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 95.06839945280439
- type: f1
value: 94.74276398224072
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (en)
type: mteb/mtop_intent
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 72.25718194254446
- type: f1
value: 53.91164489161391
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (en)
type: mteb/amazon_massive_intent
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 71.47948890383323
- type: f1
value: 69.98520247230257
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (en)
type: mteb/amazon_massive_scenario
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 76.46603900470748
- type: f1
value: 76.44111526065399
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringP2P
type: mteb/medrxiv-clustering-p2p
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 33.19106070798198
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringS2S
type: mteb/medrxiv-clustering-s2s
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 30.78772205248094
- task:
type: Reranking
dataset:
name: MTEB MindSmallReranking
type: mteb/mind_small
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 31.811231631488507
- type: mrr
value: 32.98200485378021
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus
type: nfcorpus
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 6.9
- type: map_at_10
value: 13.703000000000001
- type: map_at_100
value: 17.251
- type: map_at_1000
value: 18.795
- type: map_at_3
value: 10.366999999999999
- type: map_at_5
value: 11.675
- type: mrr_at_1
value: 47.059
- type: mrr_at_10
value: 55.816
- type: mrr_at_100
value: 56.434
- type: mrr_at_1000
value: 56.467
- type: mrr_at_3
value: 53.973000000000006
- type: mrr_at_5
value: 55.257999999999996
- type: ndcg_at_1
value: 44.737
- type: ndcg_at_10
value: 35.997
- type: ndcg_at_100
value: 33.487
- type: ndcg_at_1000
value: 41.897
- type: ndcg_at_3
value: 41.18
- type: ndcg_at_5
value: 38.721
- type: precision_at_1
value: 46.129999999999995
- type: precision_at_10
value: 26.533
- type: precision_at_100
value: 8.706
- type: precision_at_1000
value: 2.16
- type: precision_at_3
value: 38.493
- type: precision_at_5
value: 33.189
- type: recall_at_1
value: 6.9
- type: recall_at_10
value: 17.488999999999997
- type: recall_at_100
value: 34.583000000000006
- type: recall_at_1000
value: 64.942
- type: recall_at_3
value: 11.494
- type: recall_at_5
value: 13.496
- task:
type: Retrieval
dataset:
name: MTEB NQ
type: nq
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 33.028999999999996
- type: map_at_10
value: 49.307
- type: map_at_100
value: 50.205
- type: map_at_1000
value: 50.23
- type: map_at_3
value: 44.782
- type: map_at_5
value: 47.599999999999994
- type: mrr_at_1
value: 37.108999999999995
- type: mrr_at_10
value: 51.742999999999995
- type: mrr_at_100
value: 52.405
- type: mrr_at_1000
value: 52.422000000000004
- type: mrr_at_3
value: 48.087999999999994
- type: mrr_at_5
value: 50.414
- type: ndcg_at_1
value: 37.08
- type: ndcg_at_10
value: 57.236
- type: ndcg_at_100
value: 60.931999999999995
- type: ndcg_at_1000
value: 61.522
- type: ndcg_at_3
value: 48.93
- type: ndcg_at_5
value: 53.561
- type: precision_at_1
value: 37.08
- type: precision_at_10
value: 9.386
- type: precision_at_100
value: 1.1480000000000001
- type: precision_at_1000
value: 0.12
- type: precision_at_3
value: 22.258
- type: precision_at_5
value: 16.025
- type: recall_at_1
value: 33.028999999999996
- type: recall_at_10
value: 78.805
- type: recall_at_100
value: 94.643
- type: recall_at_1000
value: 99.039
- type: recall_at_3
value: 57.602
- type: recall_at_5
value: 68.253
- task:
type: Retrieval
dataset:
name: MTEB QuoraRetrieval
type: quora
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 71.122
- type: map_at_10
value: 85.237
- type: map_at_100
value: 85.872
- type: map_at_1000
value: 85.885
- type: map_at_3
value: 82.27499999999999
- type: map_at_5
value: 84.13199999999999
- type: mrr_at_1
value: 81.73
- type: mrr_at_10
value: 87.834
- type: mrr_at_100
value: 87.92
- type: mrr_at_1000
value: 87.921
- type: mrr_at_3
value: 86.878
- type: mrr_at_5
value: 87.512
- type: ndcg_at_1
value: 81.73
- type: ndcg_at_10
value: 88.85499999999999
- type: ndcg_at_100
value: 89.992
- type: ndcg_at_1000
value: 90.07
- type: ndcg_at_3
value: 85.997
- type: ndcg_at_5
value: 87.55199999999999
- type: precision_at_1
value: 81.73
- type: precision_at_10
value: 13.491
- type: precision_at_100
value: 1.536
- type: precision_at_1000
value: 0.157
- type: precision_at_3
value: 37.623
- type: precision_at_5
value: 24.742
- type: recall_at_1
value: 71.122
- type: recall_at_10
value: 95.935
- type: recall_at_100
value: 99.657
- type: recall_at_1000
value: 99.996
- type: recall_at_3
value: 87.80799999999999
- type: recall_at_5
value: 92.161
- task:
type: Clustering
dataset:
name: MTEB RedditClustering
type: mteb/reddit-clustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 63.490029238193756
- task:
type: Clustering
dataset:
name: MTEB RedditClusteringP2P
type: mteb/reddit-clustering-p2p
config: default
split: test
revision: 282350215ef01743dc01b456c7f5241fa8937f16
metrics:
- type: v_measure
value: 65.13153408508836
- task:
type: Retrieval
dataset:
name: MTEB SCIDOCS
type: scidocs
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 4.202999999999999
- type: map_at_10
value: 10.174
- type: map_at_100
value: 12.138
- type: map_at_1000
value: 12.418
- type: map_at_3
value: 7.379
- type: map_at_5
value: 8.727
- type: mrr_at_1
value: 20.7
- type: mrr_at_10
value: 30.389
- type: mrr_at_100
value: 31.566
- type: mrr_at_1000
value: 31.637999999999998
- type: mrr_at_3
value: 27.133000000000003
- type: mrr_at_5
value: 29.078
- type: ndcg_at_1
value: 20.7
- type: ndcg_at_10
value: 17.355999999999998
- type: ndcg_at_100
value: 25.151
- type: ndcg_at_1000
value: 30.37
- type: ndcg_at_3
value: 16.528000000000002
- type: ndcg_at_5
value: 14.396999999999998
- type: precision_at_1
value: 20.7
- type: precision_at_10
value: 8.98
- type: precision_at_100
value: 2.015
- type: precision_at_1000
value: 0.327
- type: precision_at_3
value: 15.367
- type: precision_at_5
value: 12.559999999999999
- type: recall_at_1
value: 4.202999999999999
- type: recall_at_10
value: 18.197
- type: recall_at_100
value: 40.903
- type: recall_at_1000
value: 66.427
- type: recall_at_3
value: 9.362
- type: recall_at_5
value: 12.747
- task:
type: STS
dataset:
name: MTEB SICK-R
type: mteb/sickr-sts
config: default
split: test
revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
metrics:
- type: cos_sim_spearman
value: 81.69890989765257
- task:
type: STS
dataset:
name: MTEB STS12
type: mteb/sts12-sts
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_spearman
value: 75.31953790551489
- task:
type: STS
dataset:
name: MTEB STS13
type: mteb/sts13-sts
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_spearman
value: 87.44050861280759
- task:
type: STS
dataset:
name: MTEB STS14
type: mteb/sts14-sts
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_spearman
value: 81.86922869270393
- task:
type: STS
dataset:
name: MTEB STS15
type: mteb/sts15-sts
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_spearman
value: 88.9399170304284
- task:
type: STS
dataset:
name: MTEB STS16
type: mteb/sts16-sts
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_spearman
value: 85.38015314088582
- task:
type: STS
dataset:
name: MTEB STS17 (en-en)
type: mteb/sts17-crosslingual-sts
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_spearman
value: 90.53653527788835
- task:
type: STS
dataset:
name: MTEB STS22 (en)
type: mteb/sts22-crosslingual-sts
config: en
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_spearman
value: 68.64526474250209
- task:
type: STS
dataset:
name: MTEB STSBenchmark
type: mteb/stsbenchmark-sts
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_spearman
value: 86.56156983963042
- task:
type: Reranking
dataset:
name: MTEB SciDocsRR
type: mteb/scidocs-reranking
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 79.48610254648003
- type: mrr
value: 94.02481505422682
- task:
type: Retrieval
dataset:
name: MTEB SciFact
type: scifact
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 48.983
- type: map_at_10
value: 59.077999999999996
- type: map_at_100
value: 59.536
- type: map_at_1000
value: 59.575
- type: map_at_3
value: 55.691
- type: map_at_5
value: 57.410000000000004
- type: mrr_at_1
value: 51.666999999999994
- type: mrr_at_10
value: 60.427
- type: mrr_at_100
value: 60.763
- type: mrr_at_1000
value: 60.79900000000001
- type: mrr_at_3
value: 57.556
- type: mrr_at_5
value: 59.089000000000006
- type: ndcg_at_1
value: 51.666999999999994
- type: ndcg_at_10
value: 64.559
- type: ndcg_at_100
value: 66.58
- type: ndcg_at_1000
value: 67.64
- type: ndcg_at_3
value: 58.287
- type: ndcg_at_5
value: 61.001000000000005
- type: precision_at_1
value: 51.666999999999994
- type: precision_at_10
value: 9.067
- type: precision_at_100
value: 1.0170000000000001
- type: precision_at_1000
value: 0.11100000000000002
- type: precision_at_3
value: 23.0
- type: precision_at_5
value: 15.6
- type: recall_at_1
value: 48.983
- type: recall_at_10
value: 80.289
- type: recall_at_100
value: 89.43299999999999
- type: recall_at_1000
value: 97.667
- type: recall_at_3
value: 62.978
- type: recall_at_5
value: 69.872
- task:
type: PairClassification
dataset:
name: MTEB SprintDuplicateQuestions
type: mteb/sprintduplicatequestions-pairclassification
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.79009900990098
- type: cos_sim_ap
value: 94.94115052608419
- type: cos_sim_f1
value: 89.1260162601626
- type: cos_sim_precision
value: 90.599173553719
- type: cos_sim_recall
value: 87.7
- type: dot_accuracy
value: 99.79009900990098
- type: dot_ap
value: 94.94115052608419
- type: dot_f1
value: 89.1260162601626
- type: dot_precision
value: 90.599173553719
- type: dot_recall
value: 87.7
- type: euclidean_accuracy
value: 99.79009900990098
- type: euclidean_ap
value: 94.94115052608419
- type: euclidean_f1
value: 89.1260162601626
- type: euclidean_precision
value: 90.599173553719
- type: euclidean_recall
value: 87.7
- type: manhattan_accuracy
value: 99.7940594059406
- type: manhattan_ap
value: 94.95271414642431
- type: manhattan_f1
value: 89.24508790072387
- type: manhattan_precision
value: 92.3982869379015
- type: manhattan_recall
value: 86.3
- type: max_accuracy
value: 99.7940594059406
- type: max_ap
value: 94.95271414642431
- type: max_f1
value: 89.24508790072387
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClustering
type: mteb/stackexchange-clustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 68.43866571935851
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClusteringP2P
type: mteb/stackexchange-clustering-p2p
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 35.16579026551532
- task:
type: Reranking
dataset:
name: MTEB StackOverflowDupQuestions
type: mteb/stackoverflowdupquestions-reranking
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 52.518952473513934
- type: mrr
value: 53.292457134368895
- task:
type: Summarization
dataset:
name: MTEB SummEval
type: mteb/summeval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 31.12529588316604
- type: cos_sim_spearman
value: 32.31662126895294
- type: dot_pearson
value: 31.125303796647056
- type: dot_spearman
value: 32.31662126895294
- task:
type: Retrieval
dataset:
name: MTEB TRECCOVID
type: trec-covid
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 0.219
- type: map_at_10
value: 1.7469999999999999
- type: map_at_100
value: 10.177999999999999
- type: map_at_1000
value: 26.108999999999998
- type: map_at_3
value: 0.64
- type: map_at_5
value: 0.968
- type: mrr_at_1
value: 82.0
- type: mrr_at_10
value: 89.067
- type: mrr_at_100
value: 89.067
- type: mrr_at_1000
value: 89.067
- type: mrr_at_3
value: 88.333
- type: mrr_at_5
value: 88.73299999999999
- type: ndcg_at_1
value: 78.0
- type: ndcg_at_10
value: 71.398
- type: ndcg_at_100
value: 55.574999999999996
- type: ndcg_at_1000
value: 51.771
- type: ndcg_at_3
value: 77.765
- type: ndcg_at_5
value: 73.614
- type: precision_at_1
value: 82.0
- type: precision_at_10
value: 75.4
- type: precision_at_100
value: 58.040000000000006
- type: precision_at_1000
value: 23.516000000000002
- type: precision_at_3
value: 84.0
- type: precision_at_5
value: 78.4
- type: recall_at_1
value: 0.219
- type: recall_at_10
value: 1.958
- type: recall_at_100
value: 13.797999999999998
- type: recall_at_1000
value: 49.881
- type: recall_at_3
value: 0.672
- type: recall_at_5
value: 1.0370000000000001
- task:
type: Retrieval
dataset:
name: MTEB Touche2020
type: webis-touche2020
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 1.8610000000000002
- type: map_at_10
value: 8.705
- type: map_at_100
value: 15.164
- type: map_at_1000
value: 16.78
- type: map_at_3
value: 4.346
- type: map_at_5
value: 6.151
- type: mrr_at_1
value: 22.448999999999998
- type: mrr_at_10
value: 41.556
- type: mrr_at_100
value: 42.484
- type: mrr_at_1000
value: 42.494
- type: mrr_at_3
value: 37.755
- type: mrr_at_5
value: 40.102
- type: ndcg_at_1
value: 21.429000000000002
- type: ndcg_at_10
value: 23.439
- type: ndcg_at_100
value: 36.948
- type: ndcg_at_1000
value: 48.408
- type: ndcg_at_3
value: 22.261
- type: ndcg_at_5
value: 23.085
- type: precision_at_1
value: 22.448999999999998
- type: precision_at_10
value: 21.633
- type: precision_at_100
value: 8.02
- type: precision_at_1000
value: 1.5939999999999999
- type: precision_at_3
value: 23.810000000000002
- type: precision_at_5
value: 24.490000000000002
- type: recall_at_1
value: 1.8610000000000002
- type: recall_at_10
value: 15.876000000000001
- type: recall_at_100
value: 50.300999999999995
- type: recall_at_1000
value: 86.098
- type: recall_at_3
value: 5.892
- type: recall_at_5
value: 9.443
- task:
type: Classification
dataset:
name: MTEB ToxicConversationsClassification
type: mteb/toxic_conversations_50k
config: default
split: test
revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
metrics:
- type: accuracy
value: 70.3264
- type: ap
value: 13.249577616243794
- type: f1
value: 53.621518367695685
- task:
type: Classification
dataset:
name: MTEB TweetSentimentExtractionClassification
type: mteb/tweet_sentiment_extraction
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 61.57611771363894
- type: f1
value: 61.79797478568639
- task:
type: Clustering
dataset:
name: MTEB TwentyNewsgroupsClustering
type: mteb/twentynewsgroups-clustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 53.38315344479284
- task:
type: PairClassification
dataset:
name: MTEB TwitterSemEval2015
type: mteb/twittersemeval2015-pairclassification
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 87.55438993860642
- type: cos_sim_ap
value: 77.98702600017738
- type: cos_sim_f1
value: 71.94971653931476
- type: cos_sim_precision
value: 67.50693802035153
- type: cos_sim_recall
value: 77.01846965699208
- type: dot_accuracy
value: 87.55438993860642
- type: dot_ap
value: 77.98702925907986
- type: dot_f1
value: 71.94971653931476
- type: dot_precision
value: 67.50693802035153
- type: dot_recall
value: 77.01846965699208
- type: euclidean_accuracy
value: 87.55438993860642
- type: euclidean_ap
value: 77.98702951957925
- type: euclidean_f1
value: 71.94971653931476
- type: euclidean_precision
value: 67.50693802035153
- type: euclidean_recall
value: 77.01846965699208
- type: manhattan_accuracy
value: 87.54246885617214
- type: manhattan_ap
value: 77.95531413902947
- type: manhattan_f1
value: 71.93605683836589
- type: manhattan_precision
value: 69.28152492668622
- type: manhattan_recall
value: 74.80211081794195
- type: max_accuracy
value: 87.55438993860642
- type: max_ap
value: 77.98702951957925
- type: max_f1
value: 71.94971653931476
- task:
type: PairClassification
dataset:
name: MTEB TwitterURLCorpus
type: mteb/twitterurlcorpus-pairclassification
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 89.47296930182016
- type: cos_sim_ap
value: 86.92853616302108
- type: cos_sim_f1
value: 79.35138351681047
- type: cos_sim_precision
value: 76.74820143884892
- type: cos_sim_recall
value: 82.13735756082538
- type: dot_accuracy
value: 89.47296930182016
- type: dot_ap
value: 86.92854339601595
- type: dot_f1
value: 79.35138351681047
- type: dot_precision
value: 76.74820143884892
- type: dot_recall
value: 82.13735756082538
- type: euclidean_accuracy
value: 89.47296930182016
- type: euclidean_ap
value: 86.92854191061649
- type: euclidean_f1
value: 79.35138351681047
- type: euclidean_precision
value: 76.74820143884892
- type: euclidean_recall
value: 82.13735756082538
- type: manhattan_accuracy
value: 89.47685023479644
- type: manhattan_ap
value: 86.90063722679578
- type: manhattan_f1
value: 79.30753865502702
- type: manhattan_precision
value: 76.32066068631639
- type: manhattan_recall
value: 82.53772713273791
- type: max_accuracy
value: 89.47685023479644
- type: max_ap
value: 86.92854339601595
- type: max_f1
value: 79.35138351681047
---
Clone of hkunlp/instructor-xl with an added `requirements.txt` for the inference endpoint and a handler that allows use with LangChain.
# hkunlp/instructor-xl
We introduce **Instructor**👨🏫, an instruction-finetuned text embedding model that can generate text embeddings tailored to any task (e.g., classification, retrieval, clustering, text evaluation) and domain (e.g., science, finance) ***by simply providing the task instruction, without any finetuning***. Instructor👨🏫 achieves state-of-the-art results on 70 diverse embedding tasks!
The model is easy to use with **our customized** `sentence-transformer` library. For more details, check out [our paper](https://arxiv.org/abs/2212.09741) and [project page](https://instructor-embedding.github.io/)!
**************************** **Updates** ****************************
* 01/21: We released a new [checkpoint](https://huggingface.co/hkunlp/instructor-xl) trained with hard negatives, which gives better performance.
* 12/21: We released our [paper](https://arxiv.org/abs/2212.09741), [code](https://github.com/HKUNLP/instructor-embedding), [checkpoint](https://huggingface.co/hkunlp/instructor-xl) and [project page](https://instructor-embedding.github.io/)! Check them out!
## Quick start
<hr />
## Installation
```bash
pip install InstructorEmbedding
```
## Compute your customized embeddings
Then you can use the model like this to calculate domain-specific and task-aware embeddings:
```python
from InstructorEmbedding import INSTRUCTOR

model = INSTRUCTOR('hkunlp/instructor-xl')
sentence = "3D ActionSLAM: wearable person tracking in multi-floor environments"
instruction = "Represent the Science title:"
# Each input is an [instruction, text] pair.
embeddings = model.encode([[instruction, sentence]])
print(embeddings)
```
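`embeddings` is a NumPy array with one row per `[instruction, sentence]` pair, so a single pair yields a single embedding vector.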
## Use cases
<hr />
## Calculate embeddings for your customized texts
If you want to calculate customized embeddings for specific sentences, you may follow the unified template below to write instructions (a concrete sketch follows the list):
Represent the `domain` `text_type` for `task_objective`:
* `domain` is optional, and it specifies the domain of the text, e.g., science, finance, medicine, etc.
* `text_type` is required, and it specifies the encoding unit, e.g., sentence, document, paragraph, etc.
* `task_objective` is optional, and it specifies the objective of embedding, e.g., retrieve a document, classify the sentence, etc.
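As a minimal sketch, here is how the template expands into concrete instruction strings. The domains, objectives, and example texts below are illustrative assumptions, not prescriptions from the paper:

```python
from InstructorEmbedding import INSTRUCTOR

model = INSTRUCTOR('hkunlp/instructor-xl')

# Each pair follows "Represent the <domain> <text_type> for <task_objective>:".
# The domains and objectives here are illustrative assumptions.
pairs = [
    ["Represent the Finance paragraph for classification: ",
     "Quarterly revenue grew 12% year over year."],
    ["Represent the Medicine sentence for retrieving supporting documents: ",
     "What are common side effects of statins?"],
    # `domain` and `task_objective` are optional, so `text_type` alone is also valid:
    ["Represent the sentence: ",
     "A quick brown fox jumps over the lazy dog."],
]
embeddings = model.encode(pairs)
print(embeddings.shape)  # one embedding vector per (instruction, text) pair
```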
## Calculate Sentence similarities
You can further use the model to compute similarities between two groups of sentences, with **customized embeddings**.
```python
from sklearn.metrics.pairwise import cosine_similarity

# `model` is the INSTRUCTOR model loaded in the quick start above.
sentences_a = [['Represent the Science sentence: ', 'Parton energy loss in QCD matter'],
               ['Represent the Financial statement: ', 'The Federal Reserve on Wednesday raised its benchmark interest rate.']]
sentences_b = [['Represent the Science sentence: ', 'The Chiral Phase Transition in Dissipative Dynamics'],
               ['Represent the Financial statement: ', 'The funds rose less than 0.5 per cent on Friday']]
embeddings_a = model.encode(sentences_a)
embeddings_b = model.encode(sentences_b)
similarities = cosine_similarity(embeddings_a, embeddings_b)
print(similarities)
```
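Here `similarities` is a 2×2 matrix: entry `(i, j)` is the cosine similarity between `sentences_a[i]` and `sentences_b[j]`, so the diagonal compares the matched science and finance pairs.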
## Information Retrieval
You can also use **customized embeddings** for information retrieval.
```python
import numpy as np
from sklearn.metrics.pairwise import cosine_similarity

# `model` is the INSTRUCTOR model loaded in the quick start above.
query = [['Represent the Wikipedia question for retrieving supporting documents: ', 'where is the food stored in a yam plant']]
corpus = [['Represent the Wikipedia document for retrieval: ', 'Capitalism has been dominant in the Western world since the end of feudalism, but most feel[who?] that the term "mixed economies" more precisely describes most contemporary economies, due to their containing both private-owned and state-owned enterprises. In capitalism, prices determine the demand-supply scale. For example, higher demand for certain goods and services lead to higher prices and lower demand for certain goods lead to lower prices.'],
          ['Represent the Wikipedia document for retrieval: ', "The disparate impact theory is especially controversial under the Fair Housing Act because the Act regulates many activities relating to housing, insurance, and mortgage loans—and some scholars have argued that the theory's use under the Fair Housing Act, combined with extensions of the Community Reinvestment Act, contributed to rise of sub-prime lending and the crash of the U.S. housing market and ensuing global economic recession"],
          ['Represent the Wikipedia document for retrieval: ', 'Disparate impact in United States labor law refers to practices in employment, housing, and other areas that adversely affect one group of people of a protected characteristic more than another, even though rules applied by employers or landlords are formally neutral. Although the protected classes vary by statute, most federal civil rights laws protect based on race, color, religion, national origin, and sex as protected traits, and some laws include disability status and other traits as well.']]
query_embeddings = model.encode(query)
corpus_embeddings = model.encode(corpus)
similarities = cosine_similarity(query_embeddings, corpus_embeddings)
# Index of the single most similar corpus document.
retrieved_doc_id = np.argmax(similarities)
print(retrieved_doc_id)
```
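If more than one document may be relevant, you can rank the whole corpus instead of taking only the top hit. A minimal sketch, reusing `similarities` and `np` from the block above:

```python
# Sort corpus documents for the first query by descending similarity.
ranking = np.argsort(-similarities[0])
for rank, doc_id in enumerate(ranking, start=1):
    print(f"rank {rank}: document {doc_id} (score={similarities[0][doc_id]:.4f})")
```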
## Clustering
Use **customized embeddings** for clustering texts in groups.
```python
import sklearn.cluster

# `model` is the INSTRUCTOR model loaded in the quick start above.
sentences = [['Represent the Medicine sentence for clustering: ', 'Dynamical Scalar Degree of Freedom in Horava-Lifshitz Gravity'],
             ['Represent the Medicine sentence for clustering: ', 'Comparison of Atmospheric Neutrino Flux Calculations at Low Energies'],
             ['Represent the Medicine sentence for clustering: ', 'Fermion Bags in the Massive Gross-Neveu Model'],
             ['Represent the Medicine sentence for clustering: ', "QCD corrections to Associated t-tbar-H production at the Tevatron"],
             ['Represent the Medicine sentence for clustering: ', 'A New Analysis of the R Measurements: Resonance Parameters of the Higher, Vector States of Charmonium']]
embeddings = model.encode(sentences)
clustering_model = sklearn.cluster.MiniBatchKMeans(n_clusters=2)
clustering_model.fit(embeddings)
cluster_assignment = clustering_model.labels_
print(cluster_assignment)
```
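`cluster_assignment` contains one integer label per input sentence; with `n_clusters=2`, each title is placed into one of two groups based on its instruction-conditioned embedding.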
"SUMMARIZATION"
]
| [
"BIOSSES",
"SCIFACT"
]
| Non_BioNLP | clone of hkunlp/instructor with added requirements.txt for inference endpoint and handler that allows use of langchain
# hkunlp/instructor-xl
We introduce **Instructor**👨🏫, an instruction-finetuned text embedding model that can generate text embeddings tailored to any task (e.g., classification, retrieval, clustering, text evaluation, etc.) and domains (e.g., science, finance, etc.) ***by simply providing the task instruction, without any finetuning***. Instructor👨 achieves sota on 70 diverse embedding tasks!
The model is easy to use with **our customized** `sentence-transformer` library. For more details, check out [our paper](https://arxiv.org/abs/2212.09741) and [project page](https://instructor-embedding.github.io/)!
**************************** **Updates** ****************************
* 01/21: We released a new [checkpoint](https://huggingface.co/hkunlp/instructor-xl) trained with hard negatives, which gives better performance.
* 12/21: We released our [paper](https://arxiv.org/abs/2212.09741), [code](https://github.com/HKUNLP/instructor-embedding), [checkpoint](https://huggingface.co/hkunlp/instructor-xl) and [project page](https://instructor-embedding.github.io/)! Check them out!
## Quick start
<hr />
## Installation
```bash
pip install InstructorEmbedding
```
## Compute your customized embeddings
Then you can use the model like this to calculate domain-specific and task-aware embeddings:
```python
from InstructorEmbedding import INSTRUCTOR
model = INSTRUCTOR('hkunlp/instructor-xl')
sentence = "3D ActionSLAM: wearable person tracking in multi-floor environments"
instruction = "Represent the Science title:"
embeddings = model.encode([[instruction,sentence]])
print(embeddings)
```
## Use cases
<hr />
## Calculate embeddings for your customized texts
If you want to calculate customized embeddings for specific sentences, you may follow the unified template to write instructions:
Represent the `domain` `text_type` for `task_objective`:
* `domain` is optional, and it specifies the domain of the text, e.g., science, finance, medicine, etc.
* `text_type` is required, and it specifies the encoding unit, e.g., sentence, document, paragraph, etc.
* `task_objective` is optional, and it specifies the objective of embedding, e.g., retrieve a document, classify the sentence, etc.
## Calculate Sentence similarities
You can further use the model to compute similarities between two groups of sentences, with **customized embeddings**.
```python
from sklearn.metrics.pairwise import cosine_similarity
sentences_a = [['Represent the Science sentence: ','Parton energy loss in QCD matter'],
['Represent the Financial statement: ','The Federal Reserve on Wednesday raised its benchmark interest rate.']]
sentences_b = [['Represent the Science sentence: ','The Chiral Phase Transition in Dissipative Dynamics'],
['Represent the Financial statement: ','The funds rose less than 0.5 per cent on Friday']]
embeddings_a = model.encode(sentences_a)
embeddings_b = model.encode(sentences_b)
similarities = cosine_similarity(embeddings_a,embeddings_b)
print(similarities)
```
## Information Retrieval
You can also use **customized embeddings** for information retrieval.
```python
import numpy as np
from sklearn.metrics.pairwise import cosine_similarity
query = [['Represent the Wikipedia question for retrieving supporting documents: ','where is the food stored in a yam plant']]
corpus = [['Represent the Wikipedia document for retrieval: ','Capitalism has been dominant in the Western world since the end of feudalism, but most feel[who?] that the term "mixed economies" more precisely describes most contemporary economies, due to their containing both private-owned and state-owned enterprises. In capitalism, prices determine the demand-supply scale. For example, higher demand for certain goods and services lead to higher prices and lower demand for certain goods lead to lower prices.'],
['Represent the Wikipedia document for retrieval: ',"The disparate impact theory is especially controversial under the Fair Housing Act because the Act regulates many activities relating to housing, insurance, and mortgage loans—and some scholars have argued that the theory's use under the Fair Housing Act, combined with extensions of the Community Reinvestment Act, contributed to rise of sub-prime lending and the crash of the U.S. housing market and ensuing global economic recession"],
['Represent the Wikipedia document for retrieval: ','Disparate impact in United States labor law refers to practices in employment, housing, and other areas that adversely affect one group of people of a protected characteristic more than another, even though rules applied by employers or landlords are formally neutral. Although the protected classes vary by statute, most federal civil rights laws protect based on race, color, religion, national origin, and sex as protected traits, and some laws include disability status and other traits as well.']]
query_embeddings = model.encode(query)
corpus_embeddings = model.encode(corpus)
similarities = cosine_similarity(query_embeddings,corpus_embeddings)
retrieved_doc_id = np.argmax(similarities)
print(retrieved_doc_id)
```
## Clustering
Use **customized embeddings** for clustering texts in groups.
```python
import sklearn.cluster
sentences = [['Represent the Medicine sentence for clustering: ','Dynamical Scalar Degree of Freedom in Horava-Lifshitz Gravity'],
['Represent the Medicine sentence for clustering: ','Comparison of Atmospheric Neutrino Flux Calculations at Low Energies'],
['Represent the Medicine sentence for clustering: ','Fermion Bags in the Massive Gross-Neveu Model'],
['Represent the Medicine sentence for clustering: ',"QCD corrections to Associated t-tbar-H production at the Tevatron"],
['Represent the Medicine sentence for clustering: ','A New Analysis of the R Measurements: Resonance Parameters of the Higher, Vector States of Charmonium']]
embeddings = model.encode(sentences)
clustering_model = sklearn.cluster.MiniBatchKMeans(n_clusters=2)
clustering_model.fit(embeddings)
cluster_assignment = clustering_model.labels_
print(cluster_assignment)
``` | {"language": "en", "license": "apache-2.0", "pipeline_tag": "sentence-similarity", "tags": ["text-embedding", "embeddings", "information-retrieval", "beir", "text-classification", "language-model", "text-clustering", "text-semantic-similarity", "text-evaluation", "prompt-retrieval", "text-reranking", "sentence-transformers", "feature-extraction", "sentence-similarity", "transformers", "t5", "English", "Sentence Similarity", "natural_questions", "ms_marco", "fever", "hotpot_qa", "mteb"], "inference": false, "model-index": [{"name": "final_xl_results", "results": [{"task": {"type": "Classification"}, "dataset": {"name": "MTEB AmazonCounterfactualClassification (en)", "type": "mteb/amazon_counterfactual", "config": "en", "split": "test", "revision": "e8379541af4e31359cca9fbcf4b00f2671dba205"}, "metrics": [{"type": "accuracy", "value": 85.08955223880596}, {"type": "ap", "value": 52.66066378722476}, {"type": "f1", "value": 79.63340218960269}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB AmazonPolarityClassification", "type": "mteb/amazon_polarity", "config": "default", "split": "test", "revision": "e2d317d38cd51312af73b3d32a06d1a08b442046"}, "metrics": [{"type": "accuracy", "value": 86.542}, {"type": "ap", "value": 81.92695193008987}, {"type": "f1", "value": 86.51466132573681}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB AmazonReviewsClassification (en)", "type": "mteb/amazon_reviews_multi", "config": "en", "split": "test", "revision": "1399c76144fd37290681b995c656ef9b2e06e26d"}, "metrics": [{"type": "accuracy", "value": 42.964}, {"type": "f1", "value": 41.43146249774862}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB ArguAna", "type": "arguana", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "map_at_1", "value": 29.872}, {"type": "map_at_10", "value": 46.342}, {"type": "map_at_100", "value": 47.152}, {"type": "map_at_1000", "value": 47.154}, {"type": "map_at_3", "value": 41.216}, {"type": "map_at_5", "value": 44.035999999999994}, {"type": "mrr_at_1", "value": 30.939}, {"type": "mrr_at_10", "value": 46.756}, {"type": "mrr_at_100", "value": 47.573}, {"type": "mrr_at_1000", "value": 47.575}, {"type": "mrr_at_3", "value": 41.548}, {"type": "mrr_at_5", "value": 44.425}, {"type": "ndcg_at_1", "value": 29.872}, {"type": "ndcg_at_10", "value": 55.65}, {"type": "ndcg_at_100", "value": 58.88099999999999}, {"type": "ndcg_at_1000", "value": 58.951}, {"type": "ndcg_at_3", "value": 45.0}, {"type": "ndcg_at_5", "value": 50.09}, {"type": "precision_at_1", "value": 29.872}, {"type": "precision_at_10", "value": 8.549}, {"type": "precision_at_100", "value": 0.991}, {"type": "precision_at_1000", "value": 0.1}, {"type": "precision_at_3", "value": 18.658}, {"type": "precision_at_5", "value": 13.669999999999998}, {"type": "recall_at_1", "value": 29.872}, {"type": "recall_at_10", "value": 85.491}, {"type": "recall_at_100", "value": 99.075}, {"type": "recall_at_1000", "value": 99.644}, {"type": "recall_at_3", "value": 55.974000000000004}, {"type": "recall_at_5", "value": 68.35}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB ArxivClusteringP2P", "type": "mteb/arxiv-clustering-p2p", "config": "default", "split": "test", "revision": "a122ad7f3f0291bf49cc6f4d32aa80929df69d5d"}, "metrics": [{"type": "v_measure", "value": 42.452729850641276}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB ArxivClusteringS2S", "type": "mteb/arxiv-clustering-s2s", "config": "default", "split": "test", "revision": 
"f910caf1a6075f7329cdf8c1a6135696f37dbd53"}, "metrics": [{"type": "v_measure", "value": 32.21141846480423}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB AskUbuntuDupQuestions", "type": "mteb/askubuntudupquestions-reranking", "config": "default", "split": "test", "revision": "2000358ca161889fa9c082cb41daa8dcfb161a54"}, "metrics": [{"type": "map", "value": 65.34710928952622}, {"type": "mrr", "value": 77.61124301983028}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB BIOSSES", "type": "mteb/biosses-sts", "config": "default", "split": "test", "revision": "d3fb88f8f02e40887cd149695127462bbcf29b4a"}, "metrics": [{"type": "cos_sim_spearman", "value": 84.15312230525639}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB Banking77Classification", "type": "mteb/banking77", "config": "default", "split": "test", "revision": "0fd18e25b25c072e09e0d92ab615fda904d66300"}, "metrics": [{"type": "accuracy", "value": 82.66233766233766}, {"type": "f1", "value": 82.04175284777669}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB BiorxivClusteringP2P", "type": "mteb/biorxiv-clustering-p2p", "config": "default", "split": "test", "revision": "65b79d1d13f80053f67aca9498d9402c2d9f1f40"}, "metrics": [{"type": "v_measure", "value": 37.36697339826455}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB BiorxivClusteringS2S", "type": "mteb/biorxiv-clustering-s2s", "config": "default", "split": "test", "revision": "258694dd0231531bc1fd9de6ceb52a0853c6d908"}, "metrics": [{"type": "v_measure", "value": 30.551241447593092}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackAndroidRetrieval", "type": "BeIR/cqadupstack", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "map_at_1", "value": 36.797000000000004}, {"type": "map_at_10", "value": 48.46}, {"type": "map_at_100", "value": 49.968}, {"type": "map_at_1000", "value": 50.080000000000005}, {"type": "map_at_3", "value": 44.71}, {"type": "map_at_5", "value": 46.592}, {"type": "mrr_at_1", "value": 45.494}, {"type": "mrr_at_10", "value": 54.747}, {"type": "mrr_at_100", "value": 55.43599999999999}, {"type": "mrr_at_1000", "value": 55.464999999999996}, {"type": "mrr_at_3", "value": 52.361000000000004}, {"type": "mrr_at_5", "value": 53.727000000000004}, {"type": "ndcg_at_1", "value": 45.494}, {"type": "ndcg_at_10", "value": 54.989}, {"type": "ndcg_at_100", "value": 60.096000000000004}, {"type": "ndcg_at_1000", "value": 61.58}, {"type": "ndcg_at_3", "value": 49.977}, {"type": "ndcg_at_5", "value": 51.964999999999996}, {"type": "precision_at_1", "value": 45.494}, {"type": "precision_at_10", "value": 10.558}, {"type": "precision_at_100", "value": 1.6049999999999998}, {"type": "precision_at_1000", "value": 0.203}, {"type": "precision_at_3", "value": 23.796}, {"type": "precision_at_5", "value": 16.881}, {"type": "recall_at_1", "value": 36.797000000000004}, {"type": "recall_at_10", "value": 66.83}, {"type": "recall_at_100", "value": 88.34100000000001}, {"type": "recall_at_1000", "value": 97.202}, {"type": "recall_at_3", "value": 51.961999999999996}, {"type": "recall_at_5", "value": 57.940000000000005}, {"type": "map_at_1", "value": 32.597}, {"type": "map_at_10", "value": 43.424}, {"type": "map_at_100", "value": 44.78}, {"type": "map_at_1000", "value": 44.913}, {"type": "map_at_3", "value": 40.315}, {"type": "map_at_5", "value": 41.987}, {"type": "mrr_at_1", "value": 40.382}, {"type": "mrr_at_10", "value": 49.219}, {"type": "mrr_at_100", "value": 49.895}, {"type": 
"mrr_at_1000", "value": 49.936}, {"type": "mrr_at_3", "value": 46.996}, {"type": "mrr_at_5", "value": 48.231}, {"type": "ndcg_at_1", "value": 40.382}, {"type": "ndcg_at_10", "value": 49.318}, {"type": "ndcg_at_100", "value": 53.839999999999996}, {"type": "ndcg_at_1000", "value": 55.82899999999999}, {"type": "ndcg_at_3", "value": 44.914}, {"type": "ndcg_at_5", "value": 46.798}, {"type": "precision_at_1", "value": 40.382}, {"type": "precision_at_10", "value": 9.274000000000001}, {"type": "precision_at_100", "value": 1.497}, {"type": "precision_at_1000", "value": 0.198}, {"type": "precision_at_3", "value": 21.592}, {"type": "precision_at_5", "value": 15.159}, {"type": "recall_at_1", "value": 32.597}, {"type": "recall_at_10", "value": 59.882000000000005}, {"type": "recall_at_100", "value": 78.446}, {"type": "recall_at_1000", "value": 90.88000000000001}, {"type": "recall_at_3", "value": 46.9}, {"type": "recall_at_5", "value": 52.222}, {"type": "map_at_1", "value": 43.8}, {"type": "map_at_10", "value": 57.293000000000006}, {"type": "map_at_100", "value": 58.321}, {"type": "map_at_1000", "value": 58.361}, {"type": "map_at_3", "value": 53.839999999999996}, {"type": "map_at_5", "value": 55.838}, {"type": "mrr_at_1", "value": 49.592000000000006}, {"type": "mrr_at_10", "value": 60.643}, {"type": "mrr_at_100", "value": 61.23499999999999}, {"type": "mrr_at_1000", "value": 61.251999999999995}, {"type": "mrr_at_3", "value": 58.265}, {"type": "mrr_at_5", "value": 59.717}, {"type": "ndcg_at_1", "value": 49.592000000000006}, {"type": "ndcg_at_10", "value": 63.364}, {"type": "ndcg_at_100", "value": 67.167}, {"type": "ndcg_at_1000", "value": 67.867}, {"type": "ndcg_at_3", "value": 57.912}, {"type": "ndcg_at_5", "value": 60.697}, {"type": "precision_at_1", "value": 49.592000000000006}, {"type": "precision_at_10", "value": 10.088}, {"type": "precision_at_100", "value": 1.2930000000000001}, {"type": "precision_at_1000", "value": 0.13899999999999998}, {"type": "precision_at_3", "value": 25.789}, {"type": "precision_at_5", "value": 17.541999999999998}, {"type": "recall_at_1", "value": 43.8}, {"type": "recall_at_10", "value": 77.635}, {"type": "recall_at_100", "value": 93.748}, {"type": "recall_at_1000", "value": 98.468}, {"type": "recall_at_3", "value": 63.223}, {"type": "recall_at_5", "value": 70.122}, {"type": "map_at_1", "value": 27.721}, {"type": "map_at_10", "value": 35.626999999999995}, {"type": "map_at_100", "value": 36.719}, {"type": "map_at_1000", "value": 36.8}, {"type": "map_at_3", "value": 32.781}, {"type": "map_at_5", "value": 34.333999999999996}, {"type": "mrr_at_1", "value": 29.604999999999997}, {"type": "mrr_at_10", "value": 37.564}, {"type": "mrr_at_100", "value": 38.505}, {"type": "mrr_at_1000", "value": 38.565}, {"type": "mrr_at_3", "value": 34.727000000000004}, {"type": "mrr_at_5", "value": 36.207}, {"type": "ndcg_at_1", "value": 29.604999999999997}, {"type": "ndcg_at_10", "value": 40.575}, {"type": "ndcg_at_100", "value": 45.613}, {"type": "ndcg_at_1000", "value": 47.676}, {"type": "ndcg_at_3", "value": 34.811}, {"type": "ndcg_at_5", "value": 37.491}, {"type": "precision_at_1", "value": 29.604999999999997}, {"type": "precision_at_10", "value": 6.1690000000000005}, {"type": "precision_at_100", "value": 0.906}, {"type": "precision_at_1000", "value": 0.11199999999999999}, {"type": "precision_at_3", "value": 14.237}, {"type": "precision_at_5", "value": 10.056}, {"type": "recall_at_1", "value": 27.721}, {"type": "recall_at_10", "value": 54.041}, {"type": "recall_at_100", "value": 
76.62299999999999}, {"type": "recall_at_1000", "value": 92.134}, {"type": "recall_at_3", "value": 38.582}, {"type": "recall_at_5", "value": 44.989000000000004}, {"type": "map_at_1", "value": 16.553}, {"type": "map_at_10", "value": 25.384}, {"type": "map_at_100", "value": 26.655}, {"type": "map_at_1000", "value": 26.778000000000002}, {"type": "map_at_3", "value": 22.733}, {"type": "map_at_5", "value": 24.119}, {"type": "mrr_at_1", "value": 20.149}, {"type": "mrr_at_10", "value": 29.705}, {"type": "mrr_at_100", "value": 30.672}, {"type": "mrr_at_1000", "value": 30.737}, {"type": "mrr_at_3", "value": 27.032}, {"type": "mrr_at_5", "value": 28.369}, {"type": "ndcg_at_1", "value": 20.149}, {"type": "ndcg_at_10", "value": 30.843999999999998}, {"type": "ndcg_at_100", "value": 36.716}, {"type": "ndcg_at_1000", "value": 39.495000000000005}, {"type": "ndcg_at_3", "value": 25.918999999999997}, {"type": "ndcg_at_5", "value": 27.992}, {"type": "precision_at_1", "value": 20.149}, {"type": "precision_at_10", "value": 5.858}, {"type": "precision_at_100", "value": 1.009}, {"type": "precision_at_1000", "value": 0.13799999999999998}, {"type": "precision_at_3", "value": 12.645000000000001}, {"type": "precision_at_5", "value": 9.179}, {"type": "recall_at_1", "value": 16.553}, {"type": "recall_at_10", "value": 43.136}, {"type": "recall_at_100", "value": 68.562}, {"type": "recall_at_1000", "value": 88.208}, {"type": "recall_at_3", "value": 29.493000000000002}, {"type": "recall_at_5", "value": 34.751}, {"type": "map_at_1", "value": 28.000999999999998}, {"type": "map_at_10", "value": 39.004}, {"type": "map_at_100", "value": 40.461999999999996}, {"type": "map_at_1000", "value": 40.566}, {"type": "map_at_3", "value": 35.805}, {"type": "map_at_5", "value": 37.672}, {"type": "mrr_at_1", "value": 33.782000000000004}, {"type": "mrr_at_10", "value": 44.702}, {"type": "mrr_at_100", "value": 45.528}, {"type": "mrr_at_1000", "value": 45.576}, {"type": "mrr_at_3", "value": 42.14}, {"type": "mrr_at_5", "value": 43.651}, {"type": "ndcg_at_1", "value": 33.782000000000004}, {"type": "ndcg_at_10", "value": 45.275999999999996}, {"type": "ndcg_at_100", "value": 50.888}, {"type": "ndcg_at_1000", "value": 52.879}, {"type": "ndcg_at_3", "value": 40.191}, {"type": "ndcg_at_5", "value": 42.731}, {"type": "precision_at_1", "value": 33.782000000000004}, {"type": "precision_at_10", "value": 8.200000000000001}, {"type": "precision_at_100", "value": 1.287}, {"type": "precision_at_1000", "value": 0.16199999999999998}, {"type": "precision_at_3", "value": 19.185}, {"type": "precision_at_5", "value": 13.667000000000002}, {"type": "recall_at_1", "value": 28.000999999999998}, {"type": "recall_at_10", "value": 58.131}, {"type": "recall_at_100", "value": 80.869}, {"type": "recall_at_1000", "value": 93.931}, {"type": "recall_at_3", "value": 44.161}, {"type": "recall_at_5", "value": 50.592000000000006}, {"type": "map_at_1", "value": 28.047}, {"type": "map_at_10", "value": 38.596000000000004}, {"type": "map_at_100", "value": 40.116}, {"type": "map_at_1000", "value": 40.232}, {"type": "map_at_3", "value": 35.205}, {"type": "map_at_5", "value": 37.076}, {"type": "mrr_at_1", "value": 34.932}, {"type": "mrr_at_10", "value": 44.496}, {"type": "mrr_at_100", "value": 45.47}, {"type": "mrr_at_1000", "value": 45.519999999999996}, {"type": "mrr_at_3", "value": 41.743}, {"type": "mrr_at_5", "value": 43.352000000000004}, {"type": "ndcg_at_1", "value": 34.932}, {"type": "ndcg_at_10", "value": 44.901}, {"type": "ndcg_at_100", "value": 50.788999999999994}, {"type": 
"ndcg_at_1000", "value": 52.867}, {"type": "ndcg_at_3", "value": 39.449}, {"type": "ndcg_at_5", "value": 41.929}, {"type": "precision_at_1", "value": 34.932}, {"type": "precision_at_10", "value": 8.311}, {"type": "precision_at_100", "value": 1.3050000000000002}, {"type": "precision_at_1000", "value": 0.166}, {"type": "precision_at_3", "value": 18.836}, {"type": "precision_at_5", "value": 13.447000000000001}, {"type": "recall_at_1", "value": 28.047}, {"type": "recall_at_10", "value": 57.717}, {"type": "recall_at_100", "value": 82.182}, {"type": "recall_at_1000", "value": 95.82000000000001}, {"type": "recall_at_3", "value": 42.448}, {"type": "recall_at_5", "value": 49.071}, {"type": "map_at_1", "value": 27.861250000000005}, {"type": "map_at_10", "value": 37.529583333333335}, {"type": "map_at_100", "value": 38.7915}, {"type": "map_at_1000", "value": 38.90558333333335}, {"type": "map_at_3", "value": 34.57333333333333}, {"type": "map_at_5", "value": 36.187166666666656}, {"type": "mrr_at_1", "value": 32.88291666666666}, {"type": "mrr_at_10", "value": 41.79750000000001}, {"type": "mrr_at_100", "value": 42.63183333333333}, {"type": "mrr_at_1000", "value": 42.68483333333333}, {"type": "mrr_at_3", "value": 39.313750000000006}, {"type": "mrr_at_5", "value": 40.70483333333333}, {"type": "ndcg_at_1", "value": 32.88291666666666}, {"type": "ndcg_at_10", "value": 43.09408333333333}, {"type": "ndcg_at_100", "value": 48.22158333333333}, {"type": "ndcg_at_1000", "value": 50.358000000000004}, {"type": "ndcg_at_3", "value": 38.129583333333336}, {"type": "ndcg_at_5", "value": 40.39266666666666}, {"type": "precision_at_1", "value": 32.88291666666666}, {"type": "precision_at_10", "value": 7.5584999999999996}, {"type": "precision_at_100", "value": 1.1903333333333332}, {"type": "precision_at_1000", "value": 0.15658333333333332}, {"type": "precision_at_3", "value": 17.495916666666666}, {"type": "precision_at_5", "value": 12.373833333333332}, {"type": "recall_at_1", "value": 27.861250000000005}, {"type": "recall_at_10", "value": 55.215916666666665}, {"type": "recall_at_100", "value": 77.392}, {"type": "recall_at_1000", "value": 92.04908333333334}, {"type": "recall_at_3", "value": 41.37475}, {"type": "recall_at_5", "value": 47.22908333333333}, {"type": "map_at_1", "value": 25.064999999999998}, {"type": "map_at_10", "value": 31.635999999999996}, {"type": "map_at_100", "value": 32.596000000000004}, {"type": "map_at_1000", "value": 32.695}, {"type": "map_at_3", "value": 29.612}, {"type": "map_at_5", "value": 30.768}, {"type": "mrr_at_1", "value": 28.528}, {"type": "mrr_at_10", "value": 34.717}, {"type": "mrr_at_100", "value": 35.558}, {"type": "mrr_at_1000", "value": 35.626000000000005}, {"type": "mrr_at_3", "value": 32.745000000000005}, {"type": "mrr_at_5", "value": 33.819}, {"type": "ndcg_at_1", "value": 28.528}, {"type": "ndcg_at_10", "value": 35.647}, {"type": "ndcg_at_100", "value": 40.207}, {"type": "ndcg_at_1000", "value": 42.695}, {"type": "ndcg_at_3", "value": 31.878}, {"type": "ndcg_at_5", "value": 33.634}, {"type": "precision_at_1", "value": 28.528}, {"type": "precision_at_10", "value": 5.46}, {"type": "precision_at_100", "value": 0.84}, {"type": "precision_at_1000", "value": 0.11399999999999999}, {"type": "precision_at_3", "value": 13.547999999999998}, {"type": "precision_at_5", "value": 9.325}, {"type": "recall_at_1", "value": 25.064999999999998}, {"type": "recall_at_10", "value": 45.096000000000004}, {"type": "recall_at_100", "value": 65.658}, {"type": "recall_at_1000", "value": 84.128}, {"type": 
"recall_at_3", "value": 34.337}, {"type": "recall_at_5", "value": 38.849000000000004}, {"type": "map_at_1", "value": 17.276}, {"type": "map_at_10", "value": 24.535}, {"type": "map_at_100", "value": 25.655}, {"type": "map_at_1000", "value": 25.782}, {"type": "map_at_3", "value": 22.228}, {"type": "map_at_5", "value": 23.612}, {"type": "mrr_at_1", "value": 21.266}, {"type": "mrr_at_10", "value": 28.474}, {"type": "mrr_at_100", "value": 29.398000000000003}, {"type": "mrr_at_1000", "value": 29.482000000000003}, {"type": "mrr_at_3", "value": 26.245}, {"type": "mrr_at_5", "value": 27.624}, {"type": "ndcg_at_1", "value": 21.266}, {"type": "ndcg_at_10", "value": 29.087000000000003}, {"type": "ndcg_at_100", "value": 34.374}, {"type": "ndcg_at_1000", "value": 37.433}, {"type": "ndcg_at_3", "value": 25.040000000000003}, {"type": "ndcg_at_5", "value": 27.116}, {"type": "precision_at_1", "value": 21.266}, {"type": "precision_at_10", "value": 5.258}, {"type": "precision_at_100", "value": 0.9299999999999999}, {"type": "precision_at_1000", "value": 0.13699999999999998}, {"type": "precision_at_3", "value": 11.849}, {"type": "precision_at_5", "value": 8.699}, {"type": "recall_at_1", "value": 17.276}, {"type": "recall_at_10", "value": 38.928000000000004}, {"type": "recall_at_100", "value": 62.529}, {"type": "recall_at_1000", "value": 84.44800000000001}, {"type": "recall_at_3", "value": 27.554000000000002}, {"type": "recall_at_5", "value": 32.915}, {"type": "map_at_1", "value": 27.297}, {"type": "map_at_10", "value": 36.957}, {"type": "map_at_100", "value": 38.252}, {"type": "map_at_1000", "value": 38.356}, {"type": "map_at_3", "value": 34.121}, {"type": "map_at_5", "value": 35.782000000000004}, {"type": "mrr_at_1", "value": 32.275999999999996}, {"type": "mrr_at_10", "value": 41.198}, {"type": "mrr_at_100", "value": 42.131}, {"type": "mrr_at_1000", "value": 42.186}, {"type": "mrr_at_3", "value": 38.557}, {"type": "mrr_at_5", "value": 40.12}, {"type": "ndcg_at_1", "value": 32.275999999999996}, {"type": "ndcg_at_10", "value": 42.516}, {"type": "ndcg_at_100", "value": 48.15}, {"type": "ndcg_at_1000", "value": 50.344}, {"type": "ndcg_at_3", "value": 37.423}, {"type": "ndcg_at_5", "value": 39.919}, {"type": "precision_at_1", "value": 32.275999999999996}, {"type": "precision_at_10", "value": 7.155}, {"type": "precision_at_100", "value": 1.123}, {"type": "precision_at_1000", "value": 0.14200000000000002}, {"type": "precision_at_3", "value": 17.163999999999998}, {"type": "precision_at_5", "value": 12.127}, {"type": "recall_at_1", "value": 27.297}, {"type": "recall_at_10", "value": 55.238}, {"type": "recall_at_100", "value": 79.2}, {"type": "recall_at_1000", "value": 94.258}, {"type": "recall_at_3", "value": 41.327000000000005}, {"type": "recall_at_5", "value": 47.588}, {"type": "map_at_1", "value": 29.142000000000003}, {"type": "map_at_10", "value": 38.769}, {"type": "map_at_100", "value": 40.292}, {"type": "map_at_1000", "value": 40.510000000000005}, {"type": "map_at_3", "value": 35.39}, {"type": "map_at_5", "value": 37.009}, {"type": "mrr_at_1", "value": 34.19}, {"type": "mrr_at_10", "value": 43.418}, {"type": "mrr_at_100", "value": 44.132}, {"type": "mrr_at_1000", "value": 44.175}, {"type": "mrr_at_3", "value": 40.547}, {"type": "mrr_at_5", "value": 42.088}, {"type": "ndcg_at_1", "value": 34.19}, {"type": "ndcg_at_10", "value": 45.14}, {"type": "ndcg_at_100", "value": 50.364}, {"type": "ndcg_at_1000", "value": 52.481}, {"type": "ndcg_at_3", "value": 39.466}, {"type": "ndcg_at_5", "value": 41.772}, {"type": 
"precision_at_1", "value": 34.19}, {"type": "precision_at_10", "value": 8.715}, {"type": "precision_at_100", "value": 1.6150000000000002}, {"type": "precision_at_1000", "value": 0.247}, {"type": "precision_at_3", "value": 18.248}, {"type": "precision_at_5", "value": 13.161999999999999}, {"type": "recall_at_1", "value": 29.142000000000003}, {"type": "recall_at_10", "value": 57.577999999999996}, {"type": "recall_at_100", "value": 81.428}, {"type": "recall_at_1000", "value": 94.017}, {"type": "recall_at_3", "value": 41.402}, {"type": "recall_at_5", "value": 47.695}, {"type": "map_at_1", "value": 22.039}, {"type": "map_at_10", "value": 30.669999999999998}, {"type": "map_at_100", "value": 31.682}, {"type": "map_at_1000", "value": 31.794}, {"type": "map_at_3", "value": 28.139999999999997}, {"type": "map_at_5", "value": 29.457}, {"type": "mrr_at_1", "value": 24.399}, {"type": "mrr_at_10", "value": 32.687}, {"type": "mrr_at_100", "value": 33.622}, {"type": "mrr_at_1000", "value": 33.698}, {"type": "mrr_at_3", "value": 30.407}, {"type": "mrr_at_5", "value": 31.552999999999997}, {"type": "ndcg_at_1", "value": 24.399}, {"type": "ndcg_at_10", "value": 35.472}, {"type": "ndcg_at_100", "value": 40.455000000000005}, {"type": "ndcg_at_1000", "value": 43.15}, {"type": "ndcg_at_3", "value": 30.575000000000003}, {"type": "ndcg_at_5", "value": 32.668}, {"type": "precision_at_1", "value": 24.399}, {"type": "precision_at_10", "value": 5.656}, {"type": "precision_at_100", "value": 0.874}, {"type": "precision_at_1000", "value": 0.121}, {"type": "precision_at_3", "value": 13.062000000000001}, {"type": "precision_at_5", "value": 9.242}, {"type": "recall_at_1", "value": 22.039}, {"type": "recall_at_10", "value": 48.379}, {"type": "recall_at_100", "value": 71.11800000000001}, {"type": "recall_at_1000", "value": 91.095}, {"type": "recall_at_3", "value": 35.108}, {"type": "recall_at_5", "value": 40.015}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB ClimateFEVER", "type": "climate-fever", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "map_at_1", "value": 10.144}, {"type": "map_at_10", "value": 18.238}, {"type": "map_at_100", "value": 20.143}, {"type": "map_at_1000", "value": 20.346}, {"type": "map_at_3", "value": 14.809}, {"type": "map_at_5", "value": 16.567999999999998}, {"type": "mrr_at_1", "value": 22.671}, {"type": "mrr_at_10", "value": 34.906}, {"type": "mrr_at_100", "value": 35.858000000000004}, {"type": "mrr_at_1000", "value": 35.898}, {"type": "mrr_at_3", "value": 31.238}, {"type": "mrr_at_5", "value": 33.342}, {"type": "ndcg_at_1", "value": 22.671}, {"type": "ndcg_at_10", "value": 26.540000000000003}, {"type": "ndcg_at_100", "value": 34.138000000000005}, {"type": "ndcg_at_1000", "value": 37.72}, {"type": "ndcg_at_3", "value": 20.766000000000002}, {"type": "ndcg_at_5", "value": 22.927}, {"type": "precision_at_1", "value": 22.671}, {"type": "precision_at_10", "value": 8.619}, {"type": "precision_at_100", "value": 1.678}, {"type": "precision_at_1000", "value": 0.23500000000000001}, {"type": "precision_at_3", "value": 15.592}, {"type": "precision_at_5", "value": 12.43}, {"type": "recall_at_1", "value": 10.144}, {"type": "recall_at_10", "value": 33.46}, {"type": "recall_at_100", "value": 59.758}, {"type": "recall_at_1000", "value": 79.704}, {"type": "recall_at_3", "value": 19.604}, {"type": "recall_at_5", "value": 25.367}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB DBPedia", "type": "dbpedia-entity", "config": "default", "split": "test", "revision": 
"None"}, "metrics": [{"type": "map_at_1", "value": 8.654}, {"type": "map_at_10", "value": 18.506}, {"type": "map_at_100", "value": 26.412999999999997}, {"type": "map_at_1000", "value": 28.13}, {"type": "map_at_3", "value": 13.379}, {"type": "map_at_5", "value": 15.529000000000002}, {"type": "mrr_at_1", "value": 66.0}, {"type": "mrr_at_10", "value": 74.13}, {"type": "mrr_at_100", "value": 74.48700000000001}, {"type": "mrr_at_1000", "value": 74.49799999999999}, {"type": "mrr_at_3", "value": 72.75}, {"type": "mrr_at_5", "value": 73.762}, {"type": "ndcg_at_1", "value": 54.50000000000001}, {"type": "ndcg_at_10", "value": 40.236}, {"type": "ndcg_at_100", "value": 44.690999999999995}, {"type": "ndcg_at_1000", "value": 52.195}, {"type": "ndcg_at_3", "value": 45.632}, {"type": "ndcg_at_5", "value": 42.952}, {"type": "precision_at_1", "value": 66.0}, {"type": "precision_at_10", "value": 31.724999999999998}, {"type": "precision_at_100", "value": 10.299999999999999}, {"type": "precision_at_1000", "value": 2.194}, {"type": "precision_at_3", "value": 48.75}, {"type": "precision_at_5", "value": 41.6}, {"type": "recall_at_1", "value": 8.654}, {"type": "recall_at_10", "value": 23.74}, {"type": "recall_at_100", "value": 50.346999999999994}, {"type": "recall_at_1000", "value": 74.376}, {"type": "recall_at_3", "value": 14.636}, {"type": "recall_at_5", "value": 18.009}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB EmotionClassification", "type": "mteb/emotion", "config": "default", "split": "test", "revision": "4f58c6b202a23cf9a4da393831edf4f9183cad37"}, "metrics": [{"type": "accuracy", "value": 53.245}, {"type": "f1", "value": 48.74520523753552}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB FEVER", "type": "fever", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "map_at_1", "value": 51.729}, {"type": "map_at_10", "value": 63.904}, {"type": "map_at_100", "value": 64.363}, {"type": "map_at_1000", "value": 64.38199999999999}, {"type": "map_at_3", "value": 61.393}, {"type": "map_at_5", "value": 63.02100000000001}, {"type": "mrr_at_1", "value": 55.686}, {"type": "mrr_at_10", "value": 67.804}, {"type": "mrr_at_100", "value": 68.15299999999999}, {"type": "mrr_at_1000", "value": 68.161}, {"type": "mrr_at_3", "value": 65.494}, {"type": "mrr_at_5", "value": 67.01599999999999}, {"type": "ndcg_at_1", "value": 55.686}, {"type": "ndcg_at_10", "value": 70.025}, {"type": "ndcg_at_100", "value": 72.011}, {"type": "ndcg_at_1000", "value": 72.443}, {"type": "ndcg_at_3", "value": 65.32900000000001}, {"type": "ndcg_at_5", "value": 68.05600000000001}, {"type": "precision_at_1", "value": 55.686}, {"type": "precision_at_10", "value": 9.358}, {"type": "precision_at_100", "value": 1.05}, {"type": "precision_at_1000", "value": 0.11}, {"type": "precision_at_3", "value": 26.318}, {"type": "precision_at_5", "value": 17.321}, {"type": "recall_at_1", "value": 51.729}, {"type": "recall_at_10", "value": 85.04}, {"type": "recall_at_100", "value": 93.777}, {"type": "recall_at_1000", "value": 96.824}, {"type": "recall_at_3", "value": 72.521}, {"type": "recall_at_5", "value": 79.148}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB FiQA2018", "type": "fiqa", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "map_at_1", "value": 23.765}, {"type": "map_at_10", "value": 39.114}, {"type": "map_at_100", "value": 40.987}, {"type": "map_at_1000", "value": 41.155}, {"type": "map_at_3", "value": 34.028000000000006}, {"type": "map_at_5", "value": 
36.925000000000004}, {"type": "mrr_at_1", "value": 46.451}, {"type": "mrr_at_10", "value": 54.711}, {"type": "mrr_at_100", "value": 55.509}, {"type": "mrr_at_1000", "value": 55.535000000000004}, {"type": "mrr_at_3", "value": 52.649}, {"type": "mrr_at_5", "value": 53.729000000000006}, {"type": "ndcg_at_1", "value": 46.451}, {"type": "ndcg_at_10", "value": 46.955999999999996}, {"type": "ndcg_at_100", "value": 53.686}, {"type": "ndcg_at_1000", "value": 56.230000000000004}, {"type": "ndcg_at_3", "value": 43.374}, {"type": "ndcg_at_5", "value": 44.372}, {"type": "precision_at_1", "value": 46.451}, {"type": "precision_at_10", "value": 13.256}, {"type": "precision_at_100", "value": 2.019}, {"type": "precision_at_1000", "value": 0.247}, {"type": "precision_at_3", "value": 29.115000000000002}, {"type": "precision_at_5", "value": 21.389}, {"type": "recall_at_1", "value": 23.765}, {"type": "recall_at_10", "value": 53.452999999999996}, {"type": "recall_at_100", "value": 78.828}, {"type": "recall_at_1000", "value": 93.938}, {"type": "recall_at_3", "value": 39.023}, {"type": "recall_at_5", "value": 45.18}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB HotpotQA", "type": "hotpotqa", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "map_at_1", "value": 31.918000000000003}, {"type": "map_at_10", "value": 46.741}, {"type": "map_at_100", "value": 47.762}, {"type": "map_at_1000", "value": 47.849000000000004}, {"type": "map_at_3", "value": 43.578}, {"type": "map_at_5", "value": 45.395}, {"type": "mrr_at_1", "value": 63.834999999999994}, {"type": "mrr_at_10", "value": 71.312}, {"type": "mrr_at_100", "value": 71.695}, {"type": "mrr_at_1000", "value": 71.714}, {"type": "mrr_at_3", "value": 69.82000000000001}, {"type": "mrr_at_5", "value": 70.726}, {"type": "ndcg_at_1", "value": 63.834999999999994}, {"type": "ndcg_at_10", "value": 55.879999999999995}, {"type": "ndcg_at_100", "value": 59.723000000000006}, {"type": "ndcg_at_1000", "value": 61.49400000000001}, {"type": "ndcg_at_3", "value": 50.964}, {"type": "ndcg_at_5", "value": 53.47}, {"type": "precision_at_1", "value": 63.834999999999994}, {"type": "precision_at_10", "value": 11.845}, {"type": "precision_at_100", "value": 1.4869999999999999}, {"type": "precision_at_1000", "value": 0.172}, {"type": "precision_at_3", "value": 32.158}, {"type": "precision_at_5", "value": 21.278}, {"type": "recall_at_1", "value": 31.918000000000003}, {"type": "recall_at_10", "value": 59.223000000000006}, {"type": "recall_at_100", "value": 74.328}, {"type": "recall_at_1000", "value": 86.05000000000001}, {"type": "recall_at_3", "value": 48.238}, {"type": "recall_at_5", "value": 53.193999999999996}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB ImdbClassification", "type": "mteb/imdb", "config": "default", "split": "test", "revision": "3d86128a09e091d6018b6d26cad27f2739fc2db7"}, "metrics": [{"type": "accuracy", "value": 79.7896}, {"type": "ap", "value": 73.65166029460288}, {"type": "f1", "value": 79.71794693711813}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB MSMARCO", "type": "msmarco", "config": "default", "split": "dev", "revision": "None"}, "metrics": [{"type": "map_at_1", "value": 22.239}, {"type": "map_at_10", "value": 34.542}, {"type": "map_at_100", "value": 35.717999999999996}, {"type": "map_at_1000", "value": 35.764}, {"type": "map_at_3", "value": 30.432}, {"type": "map_at_5", "value": 32.81}, {"type": "mrr_at_1", "value": 22.908}, {"type": "mrr_at_10", "value": 35.127}, {"type": "mrr_at_100", 
"value": 36.238}, {"type": "mrr_at_1000", "value": 36.278}, {"type": "mrr_at_3", "value": 31.076999999999998}, {"type": "mrr_at_5", "value": 33.419}, {"type": "ndcg_at_1", "value": 22.908}, {"type": "ndcg_at_10", "value": 41.607}, {"type": "ndcg_at_100", "value": 47.28}, {"type": "ndcg_at_1000", "value": 48.414}, {"type": "ndcg_at_3", "value": 33.253}, {"type": "ndcg_at_5", "value": 37.486000000000004}, {"type": "precision_at_1", "value": 22.908}, {"type": "precision_at_10", "value": 6.645}, {"type": "precision_at_100", "value": 0.9490000000000001}, {"type": "precision_at_1000", "value": 0.105}, {"type": "precision_at_3", "value": 14.130999999999998}, {"type": "precision_at_5", "value": 10.616}, {"type": "recall_at_1", "value": 22.239}, {"type": "recall_at_10", "value": 63.42}, {"type": "recall_at_100", "value": 89.696}, {"type": "recall_at_1000", "value": 98.351}, {"type": "recall_at_3", "value": 40.77}, {"type": "recall_at_5", "value": 50.93}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MTOPDomainClassification (en)", "type": "mteb/mtop_domain", "config": "en", "split": "test", "revision": "d80d48c1eb48d3562165c59d59d0034df9fff0bf"}, "metrics": [{"type": "accuracy", "value": 95.06839945280439}, {"type": "f1", "value": 94.74276398224072}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MTOPIntentClassification (en)", "type": "mteb/mtop_intent", "config": "en", "split": "test", "revision": "ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba"}, "metrics": [{"type": "accuracy", "value": 72.25718194254446}, {"type": "f1", "value": 53.91164489161391}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MassiveIntentClassification (en)", "type": "mteb/amazon_massive_intent", "config": "en", "split": "test", "revision": "31efe3c427b0bae9c22cbb560b8f15491cc6bed7"}, "metrics": [{"type": "accuracy", "value": 71.47948890383323}, {"type": "f1", "value": 69.98520247230257}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MassiveScenarioClassification (en)", "type": "mteb/amazon_massive_scenario", "config": "en", "split": "test", "revision": "7d571f92784cd94a019292a1f45445077d0ef634"}, "metrics": [{"type": "accuracy", "value": 76.46603900470748}, {"type": "f1", "value": 76.44111526065399}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB MedrxivClusteringP2P", "type": "mteb/medrxiv-clustering-p2p", "config": "default", "split": "test", "revision": "e7a26af6f3ae46b30dde8737f02c07b1505bcc73"}, "metrics": [{"type": "v_measure", "value": 33.19106070798198}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB MedrxivClusteringS2S", "type": "mteb/medrxiv-clustering-s2s", "config": "default", "split": "test", "revision": "35191c8c0dca72d8ff3efcd72aa802307d469663"}, "metrics": [{"type": "v_measure", "value": 30.78772205248094}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB MindSmallReranking", "type": "mteb/mind_small", "config": "default", "split": "test", "revision": "3bdac13927fdc888b903db93b2ffdbd90b295a69"}, "metrics": [{"type": "map", "value": 31.811231631488507}, {"type": "mrr", "value": 32.98200485378021}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB NFCorpus", "type": "nfcorpus", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "map_at_1", "value": 6.9}, {"type": "map_at_10", "value": 13.703000000000001}, {"type": "map_at_100", "value": 17.251}, {"type": "map_at_1000", "value": 18.795}, {"type": "map_at_3", "value": 10.366999999999999}, {"type": "map_at_5", 
"value": 11.675}, {"type": "mrr_at_1", "value": 47.059}, {"type": "mrr_at_10", "value": 55.816}, {"type": "mrr_at_100", "value": 56.434}, {"type": "mrr_at_1000", "value": 56.467}, {"type": "mrr_at_3", "value": 53.973000000000006}, {"type": "mrr_at_5", "value": 55.257999999999996}, {"type": "ndcg_at_1", "value": 44.737}, {"type": "ndcg_at_10", "value": 35.997}, {"type": "ndcg_at_100", "value": 33.487}, {"type": "ndcg_at_1000", "value": 41.897}, {"type": "ndcg_at_3", "value": 41.18}, {"type": "ndcg_at_5", "value": 38.721}, {"type": "precision_at_1", "value": 46.129999999999995}, {"type": "precision_at_10", "value": 26.533}, {"type": "precision_at_100", "value": 8.706}, {"type": "precision_at_1000", "value": 2.16}, {"type": "precision_at_3", "value": 38.493}, {"type": "precision_at_5", "value": 33.189}, {"type": "recall_at_1", "value": 6.9}, {"type": "recall_at_10", "value": 17.488999999999997}, {"type": "recall_at_100", "value": 34.583000000000006}, {"type": "recall_at_1000", "value": 64.942}, {"type": "recall_at_3", "value": 11.494}, {"type": "recall_at_5", "value": 13.496}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB NQ", "type": "nq", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "map_at_1", "value": 33.028999999999996}, {"type": "map_at_10", "value": 49.307}, {"type": "map_at_100", "value": 50.205}, {"type": "map_at_1000", "value": 50.23}, {"type": "map_at_3", "value": 44.782}, {"type": "map_at_5", "value": 47.599999999999994}, {"type": "mrr_at_1", "value": 37.108999999999995}, {"type": "mrr_at_10", "value": 51.742999999999995}, {"type": "mrr_at_100", "value": 52.405}, {"type": "mrr_at_1000", "value": 52.422000000000004}, {"type": "mrr_at_3", "value": 48.087999999999994}, {"type": "mrr_at_5", "value": 50.414}, {"type": "ndcg_at_1", "value": 37.08}, {"type": "ndcg_at_10", "value": 57.236}, {"type": "ndcg_at_100", "value": 60.931999999999995}, {"type": "ndcg_at_1000", "value": 61.522}, {"type": "ndcg_at_3", "value": 48.93}, {"type": "ndcg_at_5", "value": 53.561}, {"type": "precision_at_1", "value": 37.08}, {"type": "precision_at_10", "value": 9.386}, {"type": "precision_at_100", "value": 1.1480000000000001}, {"type": "precision_at_1000", "value": 0.12}, {"type": "precision_at_3", "value": 22.258}, {"type": "precision_at_5", "value": 16.025}, {"type": "recall_at_1", "value": 33.028999999999996}, {"type": "recall_at_10", "value": 78.805}, {"type": "recall_at_100", "value": 94.643}, {"type": "recall_at_1000", "value": 99.039}, {"type": "recall_at_3", "value": 57.602}, {"type": "recall_at_5", "value": 68.253}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB QuoraRetrieval", "type": "quora", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "map_at_1", "value": 71.122}, {"type": "map_at_10", "value": 85.237}, {"type": "map_at_100", "value": 85.872}, {"type": "map_at_1000", "value": 85.885}, {"type": "map_at_3", "value": 82.27499999999999}, {"type": "map_at_5", "value": 84.13199999999999}, {"type": "mrr_at_1", "value": 81.73}, {"type": "mrr_at_10", "value": 87.834}, {"type": "mrr_at_100", "value": 87.92}, {"type": "mrr_at_1000", "value": 87.921}, {"type": "mrr_at_3", "value": 86.878}, {"type": "mrr_at_5", "value": 87.512}, {"type": "ndcg_at_1", "value": 81.73}, {"type": "ndcg_at_10", "value": 88.85499999999999}, {"type": "ndcg_at_100", "value": 89.992}, {"type": "ndcg_at_1000", "value": 90.07}, {"type": "ndcg_at_3", "value": 85.997}, {"type": "ndcg_at_5", "value": 87.55199999999999}, {"type": 
"precision_at_1", "value": 81.73}, {"type": "precision_at_10", "value": 13.491}, {"type": "precision_at_100", "value": 1.536}, {"type": "precision_at_1000", "value": 0.157}, {"type": "precision_at_3", "value": 37.623}, {"type": "precision_at_5", "value": 24.742}, {"type": "recall_at_1", "value": 71.122}, {"type": "recall_at_10", "value": 95.935}, {"type": "recall_at_100", "value": 99.657}, {"type": "recall_at_1000", "value": 99.996}, {"type": "recall_at_3", "value": 87.80799999999999}, {"type": "recall_at_5", "value": 92.161}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB RedditClustering", "type": "mteb/reddit-clustering", "config": "default", "split": "test", "revision": "24640382cdbf8abc73003fb0fa6d111a705499eb"}, "metrics": [{"type": "v_measure", "value": 63.490029238193756}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB RedditClusteringP2P", "type": "mteb/reddit-clustering-p2p", "config": "default", "split": "test", "revision": "282350215ef01743dc01b456c7f5241fa8937f16"}, "metrics": [{"type": "v_measure", "value": 65.13153408508836}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB SCIDOCS", "type": "scidocs", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "map_at_1", "value": 4.202999999999999}, {"type": "map_at_10", "value": 10.174}, {"type": "map_at_100", "value": 12.138}, {"type": "map_at_1000", "value": 12.418}, {"type": "map_at_3", "value": 7.379}, {"type": "map_at_5", "value": 8.727}, {"type": "mrr_at_1", "value": 20.7}, {"type": "mrr_at_10", "value": 30.389}, {"type": "mrr_at_100", "value": 31.566}, {"type": "mrr_at_1000", "value": 31.637999999999998}, {"type": "mrr_at_3", "value": 27.133000000000003}, {"type": "mrr_at_5", "value": 29.078}, {"type": "ndcg_at_1", "value": 20.7}, {"type": "ndcg_at_10", "value": 17.355999999999998}, {"type": "ndcg_at_100", "value": 25.151}, {"type": "ndcg_at_1000", "value": 30.37}, {"type": "ndcg_at_3", "value": 16.528000000000002}, {"type": "ndcg_at_5", "value": 14.396999999999998}, {"type": "precision_at_1", "value": 20.7}, {"type": "precision_at_10", "value": 8.98}, {"type": "precision_at_100", "value": 2.015}, {"type": "precision_at_1000", "value": 0.327}, {"type": "precision_at_3", "value": 15.367}, {"type": "precision_at_5", "value": 12.559999999999999}, {"type": "recall_at_1", "value": 4.202999999999999}, {"type": "recall_at_10", "value": 18.197}, {"type": "recall_at_100", "value": 40.903}, {"type": "recall_at_1000", "value": 66.427}, {"type": "recall_at_3", "value": 9.362}, {"type": "recall_at_5", "value": 12.747}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB SICK-R", "type": "mteb/sickr-sts", "config": "default", "split": "test", "revision": "a6ea5a8cab320b040a23452cc28066d9beae2cee"}, "metrics": [{"type": "cos_sim_spearman", "value": 81.69890989765257}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS12", "type": "mteb/sts12-sts", "config": "default", "split": "test", "revision": "a0d554a64d88156834ff5ae9920b964011b16384"}, "metrics": [{"type": "cos_sim_spearman", "value": 75.31953790551489}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS13", "type": "mteb/sts13-sts", "config": "default", "split": "test", "revision": "7e90230a92c190f1bf69ae9002b8cea547a64cca"}, "metrics": [{"type": "cos_sim_spearman", "value": 87.44050861280759}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS14", "type": "mteb/sts14-sts", "config": "default", "split": "test", "revision": "6031580fec1f6af667f0bd2da0a551cf4f0b2375"}, "metrics": [{"type": 
"cos_sim_spearman", "value": 81.86922869270393}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS15", "type": "mteb/sts15-sts", "config": "default", "split": "test", "revision": "ae752c7c21bf194d8b67fd573edf7ae58183cbe3"}, "metrics": [{"type": "cos_sim_spearman", "value": 88.9399170304284}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS16", "type": "mteb/sts16-sts", "config": "default", "split": "test", "revision": "4d8694f8f0e0100860b497b999b3dbed754a0513"}, "metrics": [{"type": "cos_sim_spearman", "value": 85.38015314088582}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS17 (en-en)", "type": "mteb/sts17-crosslingual-sts", "config": "en-en", "split": "test", "revision": "af5e6fb845001ecf41f4c1e033ce921939a2a68d"}, "metrics": [{"type": "cos_sim_spearman", "value": 90.53653527788835}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS22 (en)", "type": "mteb/sts22-crosslingual-sts", "config": "en", "split": "test", "revision": "6d1ba47164174a496b7fa5d3569dae26a6813b80"}, "metrics": [{"type": "cos_sim_spearman", "value": 68.64526474250209}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STSBenchmark", "type": "mteb/stsbenchmark-sts", "config": "default", "split": "test", "revision": "b0fddb56ed78048fa8b90373c8a3cfc37b684831"}, "metrics": [{"type": "cos_sim_spearman", "value": 86.56156983963042}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB SciDocsRR", "type": "mteb/scidocs-reranking", "config": "default", "split": "test", "revision": "d3c5e1fc0b855ab6097bf1cda04dd73947d7caab"}, "metrics": [{"type": "map", "value": 79.48610254648003}, {"type": "mrr", "value": 94.02481505422682}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB SciFact", "type": "scifact", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "map_at_1", "value": 48.983}, {"type": "map_at_10", "value": 59.077999999999996}, {"type": "map_at_100", "value": 59.536}, {"type": "map_at_1000", "value": 59.575}, {"type": "map_at_3", "value": 55.691}, {"type": "map_at_5", "value": 57.410000000000004}, {"type": "mrr_at_1", "value": 51.666999999999994}, {"type": "mrr_at_10", "value": 60.427}, {"type": "mrr_at_100", "value": 60.763}, {"type": "mrr_at_1000", "value": 60.79900000000001}, {"type": "mrr_at_3", "value": 57.556}, {"type": "mrr_at_5", "value": 59.089000000000006}, {"type": "ndcg_at_1", "value": 51.666999999999994}, {"type": "ndcg_at_10", "value": 64.559}, {"type": "ndcg_at_100", "value": 66.58}, {"type": "ndcg_at_1000", "value": 67.64}, {"type": "ndcg_at_3", "value": 58.287}, {"type": "ndcg_at_5", "value": 61.001000000000005}, {"type": "precision_at_1", "value": 51.666999999999994}, {"type": "precision_at_10", "value": 9.067}, {"type": "precision_at_100", "value": 1.0170000000000001}, {"type": "precision_at_1000", "value": 0.11100000000000002}, {"type": "precision_at_3", "value": 23.0}, {"type": "precision_at_5", "value": 15.6}, {"type": "recall_at_1", "value": 48.983}, {"type": "recall_at_10", "value": 80.289}, {"type": "recall_at_100", "value": 89.43299999999999}, {"type": "recall_at_1000", "value": 97.667}, {"type": "recall_at_3", "value": 62.978}, {"type": "recall_at_5", "value": 69.872}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB SprintDuplicateQuestions", "type": "mteb/sprintduplicatequestions-pairclassification", "config": "default", "split": "test", "revision": "d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46"}, "metrics": [{"type": "cos_sim_accuracy", "value": 99.79009900990098}, {"type": "cos_sim_ap", 
"value": 94.94115052608419}, {"type": "cos_sim_f1", "value": 89.1260162601626}, {"type": "cos_sim_precision", "value": 90.599173553719}, {"type": "cos_sim_recall", "value": 87.7}, {"type": "dot_accuracy", "value": 99.79009900990098}, {"type": "dot_ap", "value": 94.94115052608419}, {"type": "dot_f1", "value": 89.1260162601626}, {"type": "dot_precision", "value": 90.599173553719}, {"type": "dot_recall", "value": 87.7}, {"type": "euclidean_accuracy", "value": 99.79009900990098}, {"type": "euclidean_ap", "value": 94.94115052608419}, {"type": "euclidean_f1", "value": 89.1260162601626}, {"type": "euclidean_precision", "value": 90.599173553719}, {"type": "euclidean_recall", "value": 87.7}, {"type": "manhattan_accuracy", "value": 99.7940594059406}, {"type": "manhattan_ap", "value": 94.95271414642431}, {"type": "manhattan_f1", "value": 89.24508790072387}, {"type": "manhattan_precision", "value": 92.3982869379015}, {"type": "manhattan_recall", "value": 86.3}, {"type": "max_accuracy", "value": 99.7940594059406}, {"type": "max_ap", "value": 94.95271414642431}, {"type": "max_f1", "value": 89.24508790072387}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB StackExchangeClustering", "type": "mteb/stackexchange-clustering", "config": "default", "split": "test", "revision": "6cbc1f7b2bc0622f2e39d2c77fa502909748c259"}, "metrics": [{"type": "v_measure", "value": 68.43866571935851}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB StackExchangeClusteringP2P", "type": "mteb/stackexchange-clustering-p2p", "config": "default", "split": "test", "revision": "815ca46b2622cec33ccafc3735d572c266efdb44"}, "metrics": [{"type": "v_measure", "value": 35.16579026551532}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB StackOverflowDupQuestions", "type": "mteb/stackoverflowdupquestions-reranking", "config": "default", "split": "test", "revision": "e185fbe320c72810689fc5848eb6114e1ef5ec69"}, "metrics": [{"type": "map", "value": 52.518952473513934}, {"type": "mrr", "value": 53.292457134368895}]}, {"task": {"type": "Summarization"}, "dataset": {"name": "MTEB SummEval", "type": "mteb/summeval", "config": "default", "split": "test", "revision": "cda12ad7615edc362dbf25a00fdd61d3b1eaf93c"}, "metrics": [{"type": "cos_sim_pearson", "value": 31.12529588316604}, {"type": "cos_sim_spearman", "value": 32.31662126895294}, {"type": "dot_pearson", "value": 31.125303796647056}, {"type": "dot_spearman", "value": 32.31662126895294}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB TRECCOVID", "type": "trec-covid", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "map_at_1", "value": 0.219}, {"type": "map_at_10", "value": 1.7469999999999999}, {"type": "map_at_100", "value": 10.177999999999999}, {"type": "map_at_1000", "value": 26.108999999999998}, {"type": "map_at_3", "value": 0.64}, {"type": "map_at_5", "value": 0.968}, {"type": "mrr_at_1", "value": 82.0}, {"type": "mrr_at_10", "value": 89.067}, {"type": "mrr_at_100", "value": 89.067}, {"type": "mrr_at_1000", "value": 89.067}, {"type": "mrr_at_3", "value": 88.333}, {"type": "mrr_at_5", "value": 88.73299999999999}, {"type": "ndcg_at_1", "value": 78.0}, {"type": "ndcg_at_10", "value": 71.398}, {"type": "ndcg_at_100", "value": 55.574999999999996}, {"type": "ndcg_at_1000", "value": 51.771}, {"type": "ndcg_at_3", "value": 77.765}, {"type": "ndcg_at_5", "value": 73.614}, {"type": "precision_at_1", "value": 82.0}, {"type": "precision_at_10", "value": 75.4}, {"type": "precision_at_100", "value": 58.040000000000006}, 
{"type": "precision_at_1000", "value": 23.516000000000002}, {"type": "precision_at_3", "value": 84.0}, {"type": "precision_at_5", "value": 78.4}, {"type": "recall_at_1", "value": 0.219}, {"type": "recall_at_10", "value": 1.958}, {"type": "recall_at_100", "value": 13.797999999999998}, {"type": "recall_at_1000", "value": 49.881}, {"type": "recall_at_3", "value": 0.672}, {"type": "recall_at_5", "value": 1.0370000000000001}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB Touche2020", "type": "webis-touche2020", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "map_at_1", "value": 1.8610000000000002}, {"type": "map_at_10", "value": 8.705}, {"type": "map_at_100", "value": 15.164}, {"type": "map_at_1000", "value": 16.78}, {"type": "map_at_3", "value": 4.346}, {"type": "map_at_5", "value": 6.151}, {"type": "mrr_at_1", "value": 22.448999999999998}, {"type": "mrr_at_10", "value": 41.556}, {"type": "mrr_at_100", "value": 42.484}, {"type": "mrr_at_1000", "value": 42.494}, {"type": "mrr_at_3", "value": 37.755}, {"type": "mrr_at_5", "value": 40.102}, {"type": "ndcg_at_1", "value": 21.429000000000002}, {"type": "ndcg_at_10", "value": 23.439}, {"type": "ndcg_at_100", "value": 36.948}, {"type": "ndcg_at_1000", "value": 48.408}, {"type": "ndcg_at_3", "value": 22.261}, {"type": "ndcg_at_5", "value": 23.085}, {"type": "precision_at_1", "value": 22.448999999999998}, {"type": "precision_at_10", "value": 21.633}, {"type": "precision_at_100", "value": 8.02}, {"type": "precision_at_1000", "value": 1.5939999999999999}, {"type": "precision_at_3", "value": 23.810000000000002}, {"type": "precision_at_5", "value": 24.490000000000002}, {"type": "recall_at_1", "value": 1.8610000000000002}, {"type": "recall_at_10", "value": 15.876000000000001}, {"type": "recall_at_100", "value": 50.300999999999995}, {"type": "recall_at_1000", "value": 86.098}, {"type": "recall_at_3", "value": 5.892}, {"type": "recall_at_5", "value": 9.443}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB ToxicConversationsClassification", "type": "mteb/toxic_conversations_50k", "config": "default", "split": "test", "revision": "d7c0de2777da35d6aae2200a62c6e0e5af397c4c"}, "metrics": [{"type": "accuracy", "value": 70.3264}, {"type": "ap", "value": 13.249577616243794}, {"type": "f1", "value": 53.621518367695685}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB TweetSentimentExtractionClassification", "type": "mteb/tweet_sentiment_extraction", "config": "default", "split": "test", "revision": "d604517c81ca91fe16a244d1248fc021f9ecee7a"}, "metrics": [{"type": "accuracy", "value": 61.57611771363894}, {"type": "f1", "value": 61.79797478568639}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB TwentyNewsgroupsClustering", "type": "mteb/twentynewsgroups-clustering", "config": "default", "split": "test", "revision": "6125ec4e24fa026cec8a478383ee943acfbd5449"}, "metrics": [{"type": "v_measure", "value": 53.38315344479284}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB TwitterSemEval2015", "type": "mteb/twittersemeval2015-pairclassification", "config": "default", "split": "test", "revision": "70970daeab8776df92f5ea462b6173c0b46fd2d1"}, "metrics": [{"type": "cos_sim_accuracy", "value": 87.55438993860642}, {"type": "cos_sim_ap", "value": 77.98702600017738}, {"type": "cos_sim_f1", "value": 71.94971653931476}, {"type": "cos_sim_precision", "value": 67.50693802035153}, {"type": "cos_sim_recall", "value": 77.01846965699208}, {"type": "dot_accuracy", "value": 
87.55438993860642}, {"type": "dot_ap", "value": 77.98702925907986}, {"type": "dot_f1", "value": 71.94971653931476}, {"type": "dot_precision", "value": 67.50693802035153}, {"type": "dot_recall", "value": 77.01846965699208}, {"type": "euclidean_accuracy", "value": 87.55438993860642}, {"type": "euclidean_ap", "value": 77.98702951957925}, {"type": "euclidean_f1", "value": 71.94971653931476}, {"type": "euclidean_precision", "value": 67.50693802035153}, {"type": "euclidean_recall", "value": 77.01846965699208}, {"type": "manhattan_accuracy", "value": 87.54246885617214}, {"type": "manhattan_ap", "value": 77.95531413902947}, {"type": "manhattan_f1", "value": 71.93605683836589}, {"type": "manhattan_precision", "value": 69.28152492668622}, {"type": "manhattan_recall", "value": 74.80211081794195}, {"type": "max_accuracy", "value": 87.55438993860642}, {"type": "max_ap", "value": 77.98702951957925}, {"type": "max_f1", "value": 71.94971653931476}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB TwitterURLCorpus", "type": "mteb/twitterurlcorpus-pairclassification", "config": "default", "split": "test", "revision": "8b6510b0b1fa4e4c4f879467980e9be563ec1cdf"}, "metrics": [{"type": "cos_sim_accuracy", "value": 89.47296930182016}, {"type": "cos_sim_ap", "value": 86.92853616302108}, {"type": "cos_sim_f1", "value": 79.35138351681047}, {"type": "cos_sim_precision", "value": 76.74820143884892}, {"type": "cos_sim_recall", "value": 82.13735756082538}, {"type": "dot_accuracy", "value": 89.47296930182016}, {"type": "dot_ap", "value": 86.92854339601595}, {"type": "dot_f1", "value": 79.35138351681047}, {"type": "dot_precision", "value": 76.74820143884892}, {"type": "dot_recall", "value": 82.13735756082538}, {"type": "euclidean_accuracy", "value": 89.47296930182016}, {"type": "euclidean_ap", "value": 86.92854191061649}, {"type": "euclidean_f1", "value": 79.35138351681047}, {"type": "euclidean_precision", "value": 76.74820143884892}, {"type": "euclidean_recall", "value": 82.13735756082538}, {"type": "manhattan_accuracy", "value": 89.47685023479644}, {"type": "manhattan_ap", "value": 86.90063722679578}, {"type": "manhattan_f1", "value": 79.30753865502702}, {"type": "manhattan_precision", "value": 76.32066068631639}, {"type": "manhattan_recall", "value": 82.53772713273791}, {"type": "max_accuracy", "value": 89.47685023479644}, {"type": "max_ap", "value": 86.92854339601595}, {"type": "max_f1", "value": 79.35138351681047}]}]}]} |
Shashwat13333/bge-base-en-v1.5 | Shashwat13333 | sentence-similarity | [
"sentence-transformers",
"safetensors",
"bert",
"sentence-similarity",
"feature-extraction",
"generated_from_trainer",
"dataset_size:150",
"loss:MatryoshkaLoss",
"loss:MultipleNegativesRankingLoss",
"en",
"arxiv:1908.10084",
"arxiv:2205.13147",
"arxiv:1705.00652",
"base_model:BAAI/bge-base-en-v1.5",
"base_model:finetune:BAAI/bge-base-en-v1.5",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"text-embeddings-inference",
"endpoints_compatible",
"region:us"
]
| 2025-02-03T12:34:28 | 2025-02-03T13:56:03 | 4 | 0 | ---
base_model: BAAI/bge-base-en-v1.5
language:
- en
library_name: sentence-transformers
license: apache-2.0
metrics:
- cosine_accuracy@1
- cosine_accuracy@3
- cosine_accuracy@5
- cosine_accuracy@10
- cosine_precision@1
- cosine_precision@3
- cosine_precision@5
- cosine_precision@10
- cosine_recall@1
- cosine_recall@3
- cosine_recall@5
- cosine_recall@10
- cosine_ndcg@10
- cosine_mrr@10
- cosine_map@100
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:150
- loss:MatryoshkaLoss
- loss:MultipleNegativesRankingLoss
widget:
- source_sentence: What services does Techchefz Digital offer for AI adoption?
sentences:
- 'We are a new breed of innovative digital transformation agency, redefining storytelling
for an always-on world.
With roots dating back to 2017, we started as a pocket size team of enthusiasts
with a goal of helping traditional businesses transform and create dynamic, digital
cultures through disruptive strategies and agile deployment of innovative solutions.'
- "At Techchefz Digital, we specialize in guiding companies through the complexities\
\ of adopting and integrating Artificial Intelligence and Machine Learning technologies.\
\ Our consultancy services are designed to enhance your operational efficiency\
\ and decision-making capabilities across all sectors. With a global network of\
\ AI/ML experts and a commitment to excellence, we are your partners in transforming\
\ innovative possibilities into real-world achievements.\n\nDATA INTELLIGENCE PLATFORMS we\
\ specialize in\nTensorFlow\nDatabricks\nTableau\nPytorch\nOpenAI\nPinecone"
- 'How can we get started with your DevOps solutions?
Getting started is easy. Contact us through our website. We''ll schedule a consultation
to discuss your needs, evaluate your current infrastructure, and propose a customized
DevOps solution designed to achieve your goals.'
- source_sentence: Have you made any services for schools and students?
sentences:
- 'How do we do Custom Development?
We follow the process below to develop custom web or mobile applications on Agile Methodology,
breaking requirements into pieces and developing and shipping them while maintaining the
utmost quality:
Requirements Analysis
We begin by understanding the client's needs and objectives for the website.
Identify key features, functionality, and any specific design preferences.
Project Planning
Then create a detailed project plan outlining the scope, timeline, and milestones.
Define the technology stack and development tools suitable for the project.
User Experience Design
Then comes the stage of developing wireframes or prototypes to visualize the website's
structure and layout. We create a custom design that aligns with the brand identity
and user experience goals.
Development
After getting Sign-off on Design from Client, we break the requirements into Sprints
on Agile Methodology, and start developing them.'
- 'This is our Portfolio
Introducing the world of Housing Finance & Banking Firm.
Corporate Website with 10 regional languages in India with analytics and user
personalization and Dashboard for Regional Managers, Sales Agents, etc. to manage
the Builder Requests, approve/deny Properties, manage visits and appointments,
manage leads, etc.
Introducing the world of Global Automotive Brand. We have implemented a Multi Locale
Multilingual Omnichannel platform for Royal Enfield. The platform supports public
websites, customer portals, internal portals, business applications for 35+
different locations all over the world.
Developed Digital Platform for Students, Guardians, Teachers, Tutors, with AI/ML
in collaboration with Successive Technologies Inc, USA. Cloud, Dev-Sec-Ops &
Data Governance
Managing cloud provisioning and modernization alongside automated infrastructure,
event-driven microservices, containerization, DevOps, cybersecurity, and 24x7
monitoring support ensures efficient, secure, and responsive IT operations.'
- "SERVICES WE PROVIDE\nFlexible engagement models tailored to your needs\nWe specialize\
\ in comprehensive website audits that provide valuable insights and recommendations\
\ to enhance your online presence.\nDigital Strategy & Consulting\nCreating digital\
\ roadmaps that transform your digital enterprise and produce a return on investment,\
\ based on our discovery framework, brainstorming sessions & current state analysis.\n\
\nPlatform Selection\nHelping you select the optimal digital experience, commerce,\
\ cloud and marketing platform for your enterprise.\n\nPlatform Builds\nDeploying\
\ next-gen scalable and agile enterprise digital platforms, along with multi-platform\
\ integrations.\nProduct Builds\nHelp you ideate, strategize, and engineer\
\ your product with help of our enterprise frameworks\nInfrastructure\nSpecialize\
\ in multi-cloud infrastructure helping you put forward the right cloud infrastructure\
\ and optimization strategy.\n\nManaged Services\nOperate and monitor your business-critical\
\ applications, data, and IT workloads, along with Application maintenance and\
\ operations.\nTeam Augmentation\nHelp you scale up and augment your existing\
\ team to solve your hiring challenges with our easy to deploy staff augmentation\
\ offerings."
- source_sentence: How did TechChefz evolve from its early days?
sentences:
- 'Why do we need Microservices?
Instead of building a monolithic application where all functionalities are tightly
integrated, microservices break down the system into modular and loosely coupled
services.
Scalability
Flexibility and Agility
Resilience and Fault Isolation
Technology Diversity
Continuous Delivery'
- 'After a transformative scuba dive in the Maldives, Mayank Maggon made a pivotal
decision to depart from the corporate ladder in December 2016. Fueled by a clear
vision to revolutionize the digital landscape, Mayank set out to leverage the
best technology ingredients, crafting custom applications and digital ecosystems
tailored to clients'' specific needs, limitations, and budgets.
However, this solo journey was not without its challenges. Mayank had to initiate
the revenue engine by offering corporate trainings and conducting online batches
for tech training across the USA. He also undertook small projects and subcontracted
modules of larger projects for clients in the US, UK, and India. It was only after
this initial groundwork that Mayank was able to hire a group of interns, whom
he meticulously trained and groomed to prepare them for handling Enterprise Level
Applications. This journey reflects Mayank''s resilience, determination, and entrepreneurial
spirit in building TechChefz Digital from the ground up.
With a passion for innovation and a relentless drive for excellence, Mayank has
steered TechChefz Digital through strategic partnerships, groundbreaking projects,
and exponential growth. His leadership has been instrumental in shaping TechChefz
Digital into a leading force in the digital transformation arena, inspiring a
culture of innovation and excellence that continues to propel the company forward.'
- 'In what ways can machine learning optimize our operations?
Machine learning algorithms can analyze operational data to identify inefficiencies,
predict maintenance needs, optimize supply chains, and automate repetitive tasks,
significantly improving operational efficiency and reducing costs.'
- source_sentence: What kind of data do you leverage for AI solutions?
sentences:
- 'Introducing the world of Global Insurance Firm: we crafted Effective Solutions
for Complex Problems and delivered comprehensive Website Development, Production
Support & Managed Services. We optimized customer journeys, integrated analytics,
CRM, ERP, and third-party applications, and implemented cutting-edge technologies
for enhanced performance and efficiency,
achieving a 200% Reduction in operational time & effort managing content & experience,
a 70% Reduction in Deployment Errors and Downtime, and 2.5X Customer Engagement, Conversion
& Retention'
- 'Our Solutions
Strategy & Digital Transformation
Innovate via digital transformation, modernize tech, craft product strategies,
enhance customer experiences, optimize data analytics, transition to cloud for
growth and efficiency
Product Engineering & Custom Development
Providing product development, enterprise web and mobile development, microservices
integrations, quality engineering, and application support services to drive innovation
and enhance operational efficiency.'
- Our AI/ML services pave the way for transformative change across industries, embodying
a client-focused approach that integrates seamlessly with human-centric innovation.
Our collaborative teams are dedicated to fostering growth, leveraging data, and
harnessing the predictive power of artificial intelligence to forge the next wave
of software excellence. We don't just deliver AI; we deliver the future.
- source_sentence: What managed services does TechChefz provide?
sentences:
- " What we do\n\nDigital Strategy\nCreating digital frameworks that transform\
\ your digital enterprise and produce a return on investment.\n\nPlatform Selection\n\
Helping you select the optimal digital experience, commerce, cloud and marketing\
\ platform for your enterprise.\n\nPlatform Builds\nDeploying next-gen scalable\
\ and agile enterprise digital platforms, along with multi-platform integrations.\n\
\nProduct Builds\nHelp you ideate, strategize, and engineer your product with\
\ help of our enterprise frameworks\n\nTeam Augmentation\nHelp you scale up and\
\ augment your existing team to solve your hiring challenges with our easy to\
\ deploy staff augmentation offerings.\nManaged Services\nOperate and monitor\
\ your business-critical applications, data, and IT workloads, along with Application\
\ maintenance and operations\n"
- 'What makes your DevOps solutions stand out from the competition?
Our DevOps solutions stand out due to our personalized approach, extensive expertise,
and commitment to innovation. We focus on delivering measurable results, such
as reduced deployment times, improved system reliability, and enhanced security,
ensuring you get the maximum benefit from our services.'
- 'Introducing the world of General Insurance Firm
In this project, we implemented a Digital Solution with Headless
Drupal as the CMS, and lightweight React JS (Next JS SSR on Node JS) with the
following features:
PWA & AMP based Web Pages
Page Speed Optimization
Reusable and scalable React JS / Next JS Templates and Components
Headless Drupal CMS with Content & Experience management, approval workflows,
etc for seamless collaboration between the business and marketing teams
Minimalistic Buy and Renewal Journeys for various products, with API integrations
and adherence to data compliances
We achieved 250% Reduction in Operational Time and Effort in managing the Content
& Experience for Buy & renew Journeys,220% Reduction in Customer Drops during
buy and renewal journeys, 300% Reduction in bounce rate on policy landing and
campaign pages'
model-index:
- name: BGE base Financial Matryoshka
results:
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 768
type: dim_768
metrics:
- type: cosine_accuracy@1
value: 0.17333333333333334
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.5466666666666666
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.6
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.6933333333333334
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.17333333333333334
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.1822222222222222
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.12
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.06933333333333333
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.17333333333333334
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.5466666666666666
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.6
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.6933333333333334
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.43705488094312567
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.3539576719576719
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.3663753684578632
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 512
type: dim_512
metrics:
- type: cosine_accuracy@1
value: 0.17333333333333334
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.5333333333333333
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.6266666666666667
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.6933333333333334
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.17333333333333334
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.17777777777777776
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.12533333333333332
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.06933333333333333
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.17333333333333334
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.5333333333333333
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.6266666666666667
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.6933333333333334
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.43324477959330543
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.3495185185185184
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.359896266319179
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 256
type: dim_256
metrics:
- type: cosine_accuracy@1
value: 0.22666666666666666
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.49333333333333335
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.56
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.68
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.22666666666666666
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.16444444444444445
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.11199999999999997
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.06799999999999998
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.22666666666666666
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.49333333333333335
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.56
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.68
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.4383628839300849
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.36210582010582004
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.3731640827722892
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 128
type: dim_128
metrics:
- type: cosine_accuracy@1
value: 0.24
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.48
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.56
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.6933333333333334
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.24
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.16
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.11199999999999997
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.06933333333333332
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.24
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.48
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.56
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.6933333333333334
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.4443870388298522
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.36651322751322746
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.37546675549059694
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 64
type: dim_64
metrics:
- type: cosine_accuracy@1
value: 0.08
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.3466666666666667
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.49333333333333335
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.56
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.08
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.11555555555555555
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.09866666666666667
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.05599999999999999
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.08
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.3466666666666667
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.49333333333333335
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.56
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.3120295466486537
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.23260846560846554
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.24731947636993173
name: Cosine Map@100
---
# BGE base Financial Matryoshka
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5). It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) <!-- at revision a5beb1e3e68b9ab74eb54cfd186867f64f240e1a -->
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
<!-- - **Training Dataset:** Unknown -->
- **Language:** en
- **License:** apache-2.0
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
)
```
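For illustration, an equivalent stack can be assembled by hand with the `models` API. This is a sketch of the architecture above, not a required loading path — loading by name, as shown in Usage below, is the normal route:

```python
from sentence_transformers import SentenceTransformer, models

# Rebuild the stack above: BERT encoder -> CLS-token pooling -> L2 normalization.
word = models.Transformer("BAAI/bge-base-en-v1.5", max_seq_length=512)
pooling = models.Pooling(word.get_word_embedding_dimension(), pooling_mode="cls")
normalize = models.Normalize()
model = SentenceTransformer(modules=[word, pooling, normalize])
```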
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("Shashwat13333/bge-base-en-v1.5")
# Run inference
sentences = [
'What managed services does TechChefz provide ?',
' What we do\n\nDigital Strategy\nCreating digital frameworks that transform your digital enterprise and produce a return on investment.\n\nPlatform Selection\nHelping you select the optimal digital experience, commerce, cloud and marketing platform for your enterprise.\n\nPlatform Builds\nDeploying next-gen scalable and agile enterprise digital platforms, along with multi-platform integrations.\n\nProduct Builds\nHelp you ideate, strategize, and engineer your product with help of our enterprise frameworks \n\nTeam Augmentation\nHelp you scale up and augment your existing team to solve your hiring challenges with our easy to deploy staff augmentation offerings .\nManaged Services\nOperate and monitor your business-critical applications, data, and IT workloads, along with Application maintenance and operations\n',
'Introducing the world of General Insurance Firm\nIn this project, we implemented Digital Solution and Implementation with Headless Drupal as the CMS, and lightweight React JS (Next JS SSR on Node JS) with the following features:\nPWA & AMP based Web Pages\nPage Speed Optimization\nReusable and scalable React JS / Next JS Templates and Components\nHeadless Drupal CMS with Content & Experience management, approval workflows, etc for seamless collaboration between the business and marketing teams\nMinimalistic Buy and Renewal Journeys for various products, with API integrations and adherence to data compliances\n\nWe achieved 250% Reduction in Operational Time and Effort in managing the Content & Experience for Buy & renew Journeys,220% Reduction in Customer Drops during buy and renewal journeys, 300% Reduction in bounce rate on policy landing and campaign pages',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
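Because the model was trained with MatryoshkaLoss, embeddings can also be truncated to the smaller trained dimensionalities with modest quality loss. A minimal sketch, assuming sentence-transformers >= 2.7 (which introduced `truncate_dim`):

```python
from sentence_transformers import SentenceTransformer

# Keep only the first 256 dimensions of each embedding.
model = SentenceTransformer("Shashwat13333/bge-base-en-v1.5", truncate_dim=256)
embeddings = model.encode(["What managed services does TechChefz provide ?"])
print(embeddings.shape)
# (1, 256)
```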
<!--
### Direct Usage (Transformers)
<details><summary>Click to see the direct usage in Transformers</summary>
</details>
-->
<!--
### Downstream Usage (Sentence Transformers)
You can finetune this model on your own dataset.
<details><summary>Click to expand</summary>
</details>
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
## Evaluation
### Metrics
#### Information Retrieval
* Datasets: `dim_768`, `dim_512`, `dim_256`, `dim_128` and `dim_64`
* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
| Metric | dim_768 | dim_512 | dim_256 | dim_128 | dim_64 |
|:--------------------|:-----------|:-----------|:-----------|:-----------|:----------|
| cosine_accuracy@1 | 0.1733 | 0.1733 | 0.2267 | 0.24 | 0.08 |
| cosine_accuracy@3 | 0.5467 | 0.5333 | 0.4933 | 0.48 | 0.3467 |
| cosine_accuracy@5 | 0.6 | 0.6267 | 0.56 | 0.56 | 0.4933 |
| cosine_accuracy@10 | 0.6933 | 0.6933 | 0.68 | 0.6933 | 0.56 |
| cosine_precision@1 | 0.1733 | 0.1733 | 0.2267 | 0.24 | 0.08 |
| cosine_precision@3 | 0.1822 | 0.1778 | 0.1644 | 0.16 | 0.1156 |
| cosine_precision@5 | 0.12 | 0.1253 | 0.112 | 0.112 | 0.0987 |
| cosine_precision@10 | 0.0693 | 0.0693 | 0.068 | 0.0693 | 0.056 |
| cosine_recall@1 | 0.1733 | 0.1733 | 0.2267 | 0.24 | 0.08 |
| cosine_recall@3 | 0.5467 | 0.5333 | 0.4933 | 0.48 | 0.3467 |
| cosine_recall@5 | 0.6 | 0.6267 | 0.56 | 0.56 | 0.4933 |
| cosine_recall@10 | 0.6933 | 0.6933 | 0.68 | 0.6933 | 0.56 |
| **cosine_ndcg@10** | **0.4371** | **0.4332** | **0.4384** | **0.4444** | **0.312** |
| cosine_mrr@10 | 0.354 | 0.3495 | 0.3621 | 0.3665 | 0.2326 |
| cosine_map@100 | 0.3664 | 0.3599 | 0.3732 | 0.3755 | 0.2473 |
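The table can be reproduced with the evaluator named above; the toy queries, corpus, and relevance judgments below are hypothetical stand-ins for the actual evaluation split:

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

model = SentenceTransformer("Shashwat13333/bge-base-en-v1.5", truncate_dim=256)

# Hypothetical placeholder data: query ids -> text, doc ids -> text,
# and query ids -> the set of relevant doc ids.
queries = {"q1": "What managed services does TechChefz provide ?"}
corpus = {
    "d1": "Managed Services: operate and monitor business-critical applications.",
    "d2": "Digital Strategy: creating digital frameworks for your enterprise.",
}
relevant_docs = {"q1": {"d1"}}

evaluator = InformationRetrievalEvaluator(queries, corpus, relevant_docs, name="dim_256")
results = evaluator(model)
print(results["dim_256_cosine_ndcg@10"])
```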
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Dataset
#### Unnamed Dataset
* Size: 150 training samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 150 samples:
| | anchor | positive |
|:--------|:---------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 7 tokens</li><li>mean: 12.4 tokens</li><li>max: 20 tokens</li></ul> | <ul><li>min: 20 tokens</li><li>mean: 126.17 tokens</li><li>max: 378 tokens</li></ul> |
* Samples:
| anchor | positive |
|:--------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>Is it hard to move old systems to the cloud?</code> | <code>We offer custom software development, digital marketing strategies, and tailored solutions to drive tangible results for your business. Our expert team combines technical prowess with industry insights to propel your business forward in the digital landscape.<br><br>"Engage, analyze & target your customers<br>Digital transformation enables you to interact with customers across multiple channels, providing personalized experiences. This could include social media engagement, interactive websites, and mobile apps." "Empower your employees & partners<br>The push for digital transformation has led many companies to embrace cloud solutions. However, the migration and integration of legacy systems into the cloud often present challenges." "Optimize & automate your operations<br>The push for digital transformation has led many companies to embrace cloud solutions. However, the migration and integration of legacy systems into the cloud often present challenges." "Transform your products<br>The push for digi...</code> |
| <code>What benefits does marketing automation offer for time management?</code> | <code>Our MarTech capabilities<br><br>Personalization<br>Involves tailoring marketing messages and experiences to individual customers. It enhances customer engagement, loyalty, and ultimately, conversion rates.<br><br>Marketing Automation<br>Marketing automation streamlines repetitive tasks such as email marketing, lead nurturing, and social media posting. It improves efficiency, saves time, and ensures timely communication with customers.<br><br>Customer Relationship Management<br>CRM systems help manage interactions with current and potential customers. They store customer data, track interactions, and facilitate communication, improving customer retention.</code> |
| <code>How can your recommendation engines improve our business?</code> | <code>How can your recommendation engines improve our business?<br>Our recommendation engines are designed to analyze customer behavior and preferences to deliver personalized suggestions, enhancing user experience, increasing sales, and boosting customer retention.</code> |
* Loss: [<code>MatryoshkaLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters:
```json
{
"loss": "MultipleNegativesRankingLoss",
"matryoshka_dims": [
768,
512,
256,
128,
64
],
"matryoshka_weights": [
1,
1,
1,
1,
1
],
"n_dims_per_step": -1
}
```
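In sentence-transformers code, the configuration above amounts to wrapping the base ranking loss in `MatryoshkaLoss`; a minimal sketch:

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss

model = SentenceTransformer("BAAI/bge-base-en-v1.5")
base_loss = MultipleNegativesRankingLoss(model)
# Apply the base loss at each truncated dimensionality, with equal weights.
loss = MatryoshkaLoss(model, base_loss, matryoshka_dims=[768, 512, 256, 128, 64])
```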
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: epoch
- `gradient_accumulation_steps`: 4
- `learning_rate`: 1e-05
- `weight_decay`: 0.01
- `num_train_epochs`: 4
- `lr_scheduler_type`: cosine
- `warmup_ratio`: 0.1
- `fp16`: True
- `load_best_model_at_end`: True
- `optim`: adamw_torch_fused
- `push_to_hub`: True
- `hub_model_id`: Shashwat13333/bge-base-en-v1.5
- `push_to_hub_model_id`: bge-base-en-v1.5
- `batch_sampler`: no_duplicates
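These non-default values correspond roughly to the following `SentenceTransformerTrainingArguments`; a sketch for orientation, with a hypothetical `output_dir` (the exhaustive list follows below):

```python
from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="bge-base-financial-matryoshka",  # hypothetical path
    eval_strategy="epoch",
    gradient_accumulation_steps=4,
    learning_rate=1e-5,
    weight_decay=0.01,
    num_train_epochs=4,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    fp16=True,
    load_best_model_at_end=True,
    optim="adamw_torch_fused",
    push_to_hub=True,
    hub_model_id="Shashwat13333/bge-base-en-v1.5",
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)
```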
#### All Hyperparameters
<details><summary>Click to expand</summary>
- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: epoch
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 8
- `per_device_eval_batch_size`: 8
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 4
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 1e-05
- `weight_decay`: 0.01
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 4
- `max_steps`: -1
- `lr_scheduler_type`: cosine
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: True
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch_fused
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: True
- `resume_from_checkpoint`: None
- `hub_model_id`: Shashwat13333/bge-base-en-v1.5
- `hub_strategy`: every_save
- `hub_private_repo`: None
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `include_for_metrics`: []
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: bge-base-en-v1.5
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `average_tokens_across_devices`: False
- `prompts`: None
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional
</details>
### Training Logs
| Epoch | Step | Training Loss | dim_768_cosine_ndcg@10 | dim_512_cosine_ndcg@10 | dim_256_cosine_ndcg@10 | dim_128_cosine_ndcg@10 | dim_64_cosine_ndcg@10 |
|:----------:|:------:|:-------------:|:----------------------:|:----------------------:|:----------------------:|:----------------------:|:---------------------:|
| 0.2105 | 1 | 4.4608 | - | - | - | - | - |
| 0.8421 | 4 | - | 0.3891 | 0.3727 | 0.4175 | 0.3876 | 0.2956 |
| 1.2105 | 5 | 4.2215 | - | - | - | - | - |
| 1.8421 | 8 | - | 0.4088 | 0.4351 | 0.4034 | 0.4052 | 0.3167 |
| 2.4211 | 10 | 3.397 | - | - | - | - | - |
| 2.8421 | 12 | - | 0.4440 | 0.4252 | 0.4133 | 0.4284 | 0.3024 |
| 3.6316 | 15 | 2.87 | - | - | - | - | - |
| **3.8421** | **16** | **-** | **0.4371** | **0.4332** | **0.4384** | **0.4444** | **0.312** |
* The bold row denotes the saved checkpoint.
### Framework Versions
- Python: 3.11.11
- Sentence Transformers: 3.3.1
- Transformers: 4.47.1
- PyTorch: 2.5.1+cu124
- Accelerate: 1.2.1
- Datasets: 3.2.0
- Tokenizers: 0.21.0
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
```
#### MatryoshkaLoss
```bibtex
@misc{kusupati2024matryoshka,
title={Matryoshka Representation Learning},
author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
year={2024},
eprint={2205.13147},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
```
#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
--> | [
"TEXT_CLASSIFICATION"
]
| [
"CRAFT"
]
| Non_BioNLP |
--> | {"base_model": "BAAI/bge-base-en-v1.5", "language": ["en"], "library_name": "sentence-transformers", "license": "apache-2.0", "metrics": ["cosine_accuracy@1", "cosine_accuracy@3", "cosine_accuracy@5", "cosine_accuracy@10", "cosine_precision@1", "cosine_precision@3", "cosine_precision@5", "cosine_precision@10", "cosine_recall@1", "cosine_recall@3", "cosine_recall@5", "cosine_recall@10", "cosine_ndcg@10", "cosine_mrr@10", "cosine_map@100"], "pipeline_tag": "sentence-similarity", "tags": ["sentence-transformers", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:150", "loss:MatryoshkaLoss", "loss:MultipleNegativesRankingLoss"], "widget": [{"source_sentence": "What services does Techchefz Digital offer for AI adoption?", "sentences": ["We are a New breed of innovative digital transformation agency, redefining storytelling for an always-on world.\nWith roots dating back to 2017, we started as a pocket size team of enthusiasts with a goal of helping traditional businesses transform and create dynamic, digital cultures through disruptive strategies and agile deployment of innovative solutions.", "At Techchefz Digital, we specialize in guiding companies through the complexities of adopting and integrating Artificial Intelligence and Machine Learning technologies. Our consultancy services are designed to enhance your operational efficiency and decision-making capabilities across all sectors. With a global network of AI/ML experts and a commitment to excellence, we are your partners in transforming innovative possibilities into real-world achievements. \n DATA INTELLIGENCE PLATFORMS we specialize in\nTensorFlow\nDatabricks\nTableau\nPytorch\nOpenAI\nPinecone\"", "How can we get started with your DevOps solutions?\nGetting started is easy. Contact us through our website. We'll schedule a consultation to discuss your needs, evaluate your current infrastructure, and propose a customized DevOps solution designed to achieve your goals."]}, {"source_sentence": "Hav you made any services for schools and students?", "sentences": ["How do we do Custom Development ?\nWe follow below process to develop custom web or mobile Application on Agile Methodology, breaking requirements in pieces and developing and shipping them with considering utmost quality:\nRequirements Analysis\nWe begin by understanding the client's needs and objectives for the website. Identify key features, functionality, and any specific design preferences.\n\nProject Planning\nThen create a detailed project plan outlining the scope, timeline, and milestones. Define the technology stack and development tools suitable for the project.\n\nUser Experience Design\nThen comes the stage of Developing wireframes or prototypes to visualize the website's structure and layout. We create a custom design that aligns with the brand identity and user experience goals.\n\nDevelopment\nAfter getting Sign-off on Design from Client, we break the requirements into Sprints on Agile Methodology, and start developing them.", "This is our Portfolio\nIntroducing the world of Housing Finance& Banking Firm.\nCorporate Website with 10 regional languages in India with analytics and user personalization and Dashboard for Regional Managers, Sales Agents, etc. to manage the Builder Requests, approve/deny Properties, manage visits and appointments, manage leads, etc.\n\n\nIntroducing the world of Global Automotive Brand.We have implemented a Multi Locale Multilingual Omnichannel platform for Royal Enfield. 
The platform supports public websites, customer portals, internal portals, business applications for over 35+ different locations all over the world.\n\nDeveloped Digital Platform for Students, Guardians, Teachers, Tutors, with AI/ML in collaboration with Successive Technologies Inc, USA. Cloud, Dev-Sec-Ops & Data Governance\nManaging cloud provisioning and modernization alongside automated infrastructure, event-driven microservices, containerization, DevOps, cybersecurity, and 24x7 monitoring support ensures efficient, secure, and responsive IT operations.", "SERVICES WE PROVIDE\nFlexible engagement models tailored to your needs\nWe specialize in comprehensive website audits that provide valuable insights and recommendations to enhance your online presence.\nDigital Strategy & Consulting\nCreating digital roadmap that transform your digital enterprise and produce a return on investment, basis our discovery framework, brainstorming sessions & current state analysis.\n\nPlatform Selection\nHelping you select the optimal digital experience, commerce, cloud and marketing platform for your enterprise.\n\nPlatform Builds\nDeploying next-gen scalable and agile enterprise digital platforms, along with multi-platform integrations. \nProduct Builds\nHelp you ideate, strategize, and engineer your product with help of our enterprise frameworks\nInfrastructure\nSpecialize in multi-cloud infrastructure helping you put forward the right cloud infrastructure and optimization strategy.\n\nManaged Services\nOperate and monitor your business-critical applications, data, and IT workloads, along with Application maintenance and operations.\nTeam Augmentation\nHelp you scale up and augment your existing team to solve your hiring challenges with our easy to deploy staff augmentation offerings.\""]}, {"source_sentence": "How did TechChefz evolve from its early days?", "sentences": ["Why do we need Microservices ?\nInstead of building a monolithic application where all functionalities are tightly integrated, microservices break down the system into modular and loosely coupled services.\n\nScalability\nFlexibility and Agility\nResilience and Fault Isolation\nTechnology Diversity\nContinuous Delivery", "After a transformative scuba dive in the Maldives, Mayank Maggon made a pivotal decision to depart from the corporate ladder in December 2016. Fueled by a clear vision to revolutionize the digital landscape, Mayank set out to leverage the best technology ingredients, crafting custom applications and digital ecosystems tailored to clients' specific needs, limitations, and budgets.\n\nHowever, this solo journey was not without its challenges. Mayank had to initiate the revenue engine by offering corporate trainings and conducting online batches for tech training across the USA. He also undertook small projects and subcontracted modules of larger projects for clients in the US, UK, and India. It was only after this initial groundwork that Mayank was able to hire a group of interns, whom he meticulously trained and groomed to prepare them for handling Enterprise Level Applications. This journey reflects Mayank's resilience, determination, and entrepreneurial spirit in building TechChefz Digital from the ground up.\n\nWith a passion for innovation and a relentless drive for excellence, Mayank has steered TechChefz Digital through strategic partnerships, groundbreaking projects, and exponential growth. 
His leadership has been instrumental in shaping TechChefz Digital into a leading force in the digital transformation arena, inspiring a culture of innovation and excellence that continues to propel the company forward.", "In what ways can machine learning optimize our operations?\nMachine learning algorithms can analyze operational data to identify inefficiencies, predict maintenance needs, optimize supply chains, and automate repetitive tasks, significantly improving operational efficiency and reducing costs."]}, {"source_sentence": "What kind of data do you leverage for AI solutions?", "sentences": ["In the Introducing the world of Global Insurance Firm, we crafted Effective Solutions for Complex Problems and delieverd a comprehensive Website Development, Production Support & Managed Services, we optimized customer journeys, integrate analytics, CRM, ERP, and third-party applications, and implement cutting-edge technologies for enhanced performance and efficiency\nand achievied 200% Reduction in operational time & effort managing content & experience, 70% Reduction in Deployment Errors and Downtime, 2.5X Customer Engagement, Conversion & Retention", "Our Solutions\nStrategy & Digital Transformation\nInnovate via digital transformation, modernize tech, craft product strategies, enhance customer experiences, optimize data analytics, transition to cloud for growth and efficiency\n\nProduct Engineering & Custom Development\nProviding product development, enterprise web and mobile development, microservices integrations, quality engineering, and application support services to drive innovation and enhance operational efficiency.", "Our AI/ML services pave the way for transformative change across industries, embodying a client-focused approach that integrates seamlessly with human-centric innovation. Our collaborative teams are dedicated to fostering growth, leveraging data, and harnessing the predictive power of artificial intelligence to forge the next wave of software excellence. We don't just deliver AI; we deliver the future."]}, {"source_sentence": "What managed services does TechChefz provide ?", "sentences": [" What we do\n\nDigital Strategy\nCreating digital frameworks that transform your digital enterprise and produce a return on investment.\n\nPlatform Selection\nHelping you select the optimal digital experience, commerce, cloud and marketing platform for your enterprise.\n\nPlatform Builds\nDeploying next-gen scalable and agile enterprise digital platforms, along with multi-platform integrations.\n\nProduct Builds\nHelp you ideate, strategize, and engineer your product with help of our enterprise frameworks \n\nTeam Augmentation\nHelp you scale up and augment your existing team to solve your hiring challenges with our easy to deploy staff augmentation offerings .\nManaged Services\nOperate and monitor your business-critical applications, data, and IT workloads, along with Application maintenance and operations\n", "What makes your DevOps solutions stand out from the competition?\nOur DevOps solutions stand out due to our personalized approach, extensive expertise, and commitment to innovation. 
We focus on delivering measurable results, such as reduced deployment times, improved system reliability, and enhanced security, ensuring you get the maximum benefit from our services.", "Introducing the world of General Insurance Firm\nIn this project, we implemented Digital Solution and Implementation with Headless Drupal as the CMS, and lightweight React JS (Next JS SSR on Node JS) with the following features:\nPWA & AMP based Web Pages\nPage Speed Optimization\nReusable and scalable React JS / Next JS Templates and Components\nHeadless Drupal CMS with Content & Experience management, approval workflows, etc for seamless collaboration between the business and marketing teams\nMinimalistic Buy and Renewal Journeys for various products, with API integrations and adherence to data compliances\n\nWe achieved 250% Reduction in Operational Time and Effort in managing the Content & Experience for Buy & renew Journeys,220% Reduction in Customer Drops during buy and renewal journeys, 300% Reduction in bounce rate on policy landing and campaign pages"]}], "model-index": [{"name": "BGE base Financial Matryoshka", "results": [{"task": {"type": "information-retrieval", "name": "Information Retrieval"}, "dataset": {"name": "dim 768", "type": "dim_768"}, "metrics": [{"type": "cosine_accuracy@1", "value": 0.17333333333333334, "name": "Cosine Accuracy@1"}, {"type": "cosine_accuracy@3", "value": 0.5466666666666666, "name": "Cosine Accuracy@3"}, {"type": "cosine_accuracy@5", "value": 0.6, "name": "Cosine Accuracy@5"}, {"type": "cosine_accuracy@10", "value": 0.6933333333333334, "name": "Cosine Accuracy@10"}, {"type": "cosine_precision@1", "value": 0.17333333333333334, "name": "Cosine Precision@1"}, {"type": "cosine_precision@3", "value": 0.1822222222222222, "name": "Cosine Precision@3"}, {"type": "cosine_precision@5", "value": 0.12, "name": "Cosine Precision@5"}, {"type": "cosine_precision@10", "value": 0.06933333333333333, "name": "Cosine Precision@10"}, {"type": "cosine_recall@1", "value": 0.17333333333333334, "name": "Cosine Recall@1"}, {"type": "cosine_recall@3", "value": 0.5466666666666666, "name": "Cosine Recall@3"}, {"type": "cosine_recall@5", "value": 0.6, "name": "Cosine Recall@5"}, {"type": "cosine_recall@10", "value": 0.6933333333333334, "name": "Cosine Recall@10"}, {"type": "cosine_ndcg@10", "value": 0.43705488094312567, "name": "Cosine Ndcg@10"}, {"type": "cosine_mrr@10", "value": 0.3539576719576719, "name": "Cosine Mrr@10"}, {"type": "cosine_map@100", "value": 0.3663753684578632, "name": "Cosine Map@100"}]}, {"task": {"type": "information-retrieval", "name": "Information Retrieval"}, "dataset": {"name": "dim 512", "type": "dim_512"}, "metrics": [{"type": "cosine_accuracy@1", "value": 0.17333333333333334, "name": "Cosine Accuracy@1"}, {"type": "cosine_accuracy@3", "value": 0.5333333333333333, "name": "Cosine Accuracy@3"}, {"type": "cosine_accuracy@5", "value": 0.6266666666666667, "name": "Cosine Accuracy@5"}, {"type": "cosine_accuracy@10", "value": 0.6933333333333334, "name": "Cosine Accuracy@10"}, {"type": "cosine_precision@1", "value": 0.17333333333333334, "name": "Cosine Precision@1"}, {"type": "cosine_precision@3", "value": 0.17777777777777776, "name": "Cosine Precision@3"}, {"type": "cosine_precision@5", "value": 0.12533333333333332, "name": "Cosine Precision@5"}, {"type": "cosine_precision@10", "value": 0.06933333333333333, "name": "Cosine Precision@10"}, {"type": "cosine_recall@1", "value": 0.17333333333333334, "name": "Cosine Recall@1"}, {"type": "cosine_recall@3", "value": 
0.5333333333333333, "name": "Cosine Recall@3"}, {"type": "cosine_recall@5", "value": 0.6266666666666667, "name": "Cosine Recall@5"}, {"type": "cosine_recall@10", "value": 0.6933333333333334, "name": "Cosine Recall@10"}, {"type": "cosine_ndcg@10", "value": 0.43324477959330543, "name": "Cosine Ndcg@10"}, {"type": "cosine_mrr@10", "value": 0.3495185185185184, "name": "Cosine Mrr@10"}, {"type": "cosine_map@100", "value": 0.359896266319179, "name": "Cosine Map@100"}]}, {"task": {"type": "information-retrieval", "name": "Information Retrieval"}, "dataset": {"name": "dim 256", "type": "dim_256"}, "metrics": [{"type": "cosine_accuracy@1", "value": 0.22666666666666666, "name": "Cosine Accuracy@1"}, {"type": "cosine_accuracy@3", "value": 0.49333333333333335, "name": "Cosine Accuracy@3"}, {"type": "cosine_accuracy@5", "value": 0.56, "name": "Cosine Accuracy@5"}, {"type": "cosine_accuracy@10", "value": 0.68, "name": "Cosine Accuracy@10"}, {"type": "cosine_precision@1", "value": 0.22666666666666666, "name": "Cosine Precision@1"}, {"type": "cosine_precision@3", "value": 0.16444444444444445, "name": "Cosine Precision@3"}, {"type": "cosine_precision@5", "value": 0.11199999999999997, "name": "Cosine Precision@5"}, {"type": "cosine_precision@10", "value": 0.06799999999999998, "name": "Cosine Precision@10"}, {"type": "cosine_recall@1", "value": 0.22666666666666666, "name": "Cosine Recall@1"}, {"type": "cosine_recall@3", "value": 0.49333333333333335, "name": "Cosine Recall@3"}, {"type": "cosine_recall@5", "value": 0.56, "name": "Cosine Recall@5"}, {"type": "cosine_recall@10", "value": 0.68, "name": "Cosine Recall@10"}, {"type": "cosine_ndcg@10", "value": 0.4383628839300849, "name": "Cosine Ndcg@10"}, {"type": "cosine_mrr@10", "value": 0.36210582010582004, "name": "Cosine Mrr@10"}, {"type": "cosine_map@100", "value": 0.3731640827722892, "name": "Cosine Map@100"}]}, {"task": {"type": "information-retrieval", "name": "Information Retrieval"}, "dataset": {"name": "dim 128", "type": "dim_128"}, "metrics": [{"type": "cosine_accuracy@1", "value": 0.24, "name": "Cosine Accuracy@1"}, {"type": "cosine_accuracy@3", "value": 0.48, "name": "Cosine Accuracy@3"}, {"type": "cosine_accuracy@5", "value": 0.56, "name": "Cosine Accuracy@5"}, {"type": "cosine_accuracy@10", "value": 0.6933333333333334, "name": "Cosine Accuracy@10"}, {"type": "cosine_precision@1", "value": 0.24, "name": "Cosine Precision@1"}, {"type": "cosine_precision@3", "value": 0.16, "name": "Cosine Precision@3"}, {"type": "cosine_precision@5", "value": 0.11199999999999997, "name": "Cosine Precision@5"}, {"type": "cosine_precision@10", "value": 0.06933333333333332, "name": "Cosine Precision@10"}, {"type": "cosine_recall@1", "value": 0.24, "name": "Cosine Recall@1"}, {"type": "cosine_recall@3", "value": 0.48, "name": "Cosine Recall@3"}, {"type": "cosine_recall@5", "value": 0.56, "name": "Cosine Recall@5"}, {"type": "cosine_recall@10", "value": 0.6933333333333334, "name": "Cosine Recall@10"}, {"type": "cosine_ndcg@10", "value": 0.4443870388298522, "name": "Cosine Ndcg@10"}, {"type": "cosine_mrr@10", "value": 0.36651322751322746, "name": "Cosine Mrr@10"}, {"type": "cosine_map@100", "value": 0.37546675549059694, "name": "Cosine Map@100"}]}, {"task": {"type": "information-retrieval", "name": "Information Retrieval"}, "dataset": {"name": "dim 64", "type": "dim_64"}, "metrics": [{"type": "cosine_accuracy@1", "value": 0.08, "name": "Cosine Accuracy@1"}, {"type": "cosine_accuracy@3", "value": 0.3466666666666667, "name": "Cosine Accuracy@3"}, {"type": 
"cosine_accuracy@5", "value": 0.49333333333333335, "name": "Cosine Accuracy@5"}, {"type": "cosine_accuracy@10", "value": 0.56, "name": "Cosine Accuracy@10"}, {"type": "cosine_precision@1", "value": 0.08, "name": "Cosine Precision@1"}, {"type": "cosine_precision@3", "value": 0.11555555555555555, "name": "Cosine Precision@3"}, {"type": "cosine_precision@5", "value": 0.09866666666666667, "name": "Cosine Precision@5"}, {"type": "cosine_precision@10", "value": 0.05599999999999999, "name": "Cosine Precision@10"}, {"type": "cosine_recall@1", "value": 0.08, "name": "Cosine Recall@1"}, {"type": "cosine_recall@3", "value": 0.3466666666666667, "name": "Cosine Recall@3"}, {"type": "cosine_recall@5", "value": 0.49333333333333335, "name": "Cosine Recall@5"}, {"type": "cosine_recall@10", "value": 0.56, "name": "Cosine Recall@10"}, {"type": "cosine_ndcg@10", "value": 0.3120295466486537, "name": "Cosine Ndcg@10"}, {"type": "cosine_mrr@10", "value": 0.23260846560846554, "name": "Cosine Mrr@10"}, {"type": "cosine_map@100", "value": 0.24731947636993173, "name": "Cosine Map@100"}]}]}]} |
BSC-NLP4BIA/biomedical-term-classifier-setfit | BSC-NLP4BIA | text-classification | [
"sentence-transformers",
"pytorch",
"roberta",
"setfit",
"text-classification",
"bert",
"biomedical",
"lexical semantics",
"bionlp",
"es",
"license:apache-2.0",
"region:us"
]
| 2024-05-22T15:47:16 | 2024-05-22T16:34:40 | 21 | 0 | ---
language:
- es
license: apache-2.0
pipeline_tag: text-classification
tags:
- setfit
- sentence-transformers
- text-classification
- bert
- biomedical
- lexical semantics
- bionlp
---
# Biomedical term classifier with SetFit in Spanish
## Table of contents
<details>
<summary>Click to expand</summary>
- [Model description](#model-description)
- [Intended uses and limitations](#intended-uses-and-limitations)
- [How to use](#how-to-use)
- [Training](#training)
- [Evaluation](#evaluation)
- [Additional information](#additional-information)
- [Author](#author)
- [Licensing information](#licensing-information)
- [Citation information](#citation-information)
- [Disclaimer](#disclaimer)
</details>
## Model description
This is a [SetFit model](https://github.com/huggingface/setfit) trained for multilabel biomedical text classification in Spanish.
## Intended uses and limitations
The model classifies medical entities into 21 classes, including diseases, medical procedures, symptoms, and drugs, among others. It does not yet cover some classes, such as body structures.
## How to use
This model is implemented as part of the KeyCARE library. First install the keycare module to call the SetFit classifier:
```bash
python -m pip install keycare
```
You can then run the KeyCARE pipeline that uses the SetFit model:
```python
from keycare.TermExtractor import TermExtractor
# Initialize the TermExtractor object
termextractor = TermExtractor()
# Run the pipeline
text = """Acude al Servicio de Urgencias por cefalea frontoparietal derecha.
Mediante biopsia se diagnostica adenocarcinoma de próstata Gleason 4+4=8 con metástasis óseas múltiples.
Se trata con Ácido Zoledrónico 4 mg iv/4 semanas.
"""
termextractor(text)
# You can also access the class storing the SetFit model
categorizer = termextractor.categorizer
```
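Alternatively, the checkpoint can presumably be loaded directly with the [setfit](https://github.com/huggingface/setfit) library, bypassing KeyCARE. Below is a minimal sketch assuming the standard SetFit API; the exact labels returned depend on how the classification head was saved:

```python
# Minimal sketch using the setfit library directly (assumption: the
# checkpoint is loadable with the standard SetFit API).
from setfit import SetFitModel

model = SetFitModel.from_pretrained("BSC-NLP4BIA/biomedical-term-classifier-setfit")

# Classify a few candidate biomedical terms; with a multilabel head the
# output is a label matrix rather than a single class per term.
terms = [
    "cefalea frontoparietal derecha",
    "adenocarcinoma de próstata",
    "Ácido Zoledrónico",
]
predictions = model.predict(terms)
print(predictions)
```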
## Training
The model has been trained using an efficient few-shot learning technique that involves:
1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning. The pre-trained model used is SapBERT-from-roberta-base-biomedical-clinical-es from the BSC-NLP4BIA research group.
2. Training a classification head with features from the fine-tuned Sentence Transformer.
The training data was obtained from NER Gold Standard corpora also produced by BSC-NLP4BIA, including [MedProcNER](https://temu.bsc.es/medprocner/), [DISTEMIST](https://temu.bsc.es/distemist/), [SympTEMIST](https://temu.bsc.es/symptemist/), [CANTEMIST](https://temu.bsc.es/cantemist/), and [PharmaCoNER](https://temu.bsc.es/pharmaconer/), among others.
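For reference, this two-step recipe looks roughly like the sketch below. This is not the authors' exact training script: the base model identifier and the toy multilabel dataset are illustrative assumptions.

```python
# Minimal sketch of the two-step SetFit recipe described above:
# contrastive fine-tuning of the Sentence Transformer body, then
# fitting a (multilabel) classification head.
from datasets import Dataset
from setfit import SetFitModel, Trainer, TrainingArguments

# Assumed Hub id for the SapBERT Spanish biomedical encoder.
base_model = "BSC-NLP4BIA/SapBERT-from-roberta-base-biomedical-clinical-es"

# Toy multilabel data (columns: symptom, procedure, drug) for illustration only.
train_ds = Dataset.from_dict({
    "text": ["cefalea", "biopsia", "Ácido Zoledrónico", "metástasis óseas"],
    "label": [[1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 0, 0]],
})

# one-vs-rest wraps the head so each of the 21 classes gets its own classifier.
model = SetFitModel.from_pretrained(base_model, multi_target_strategy="one-vs-rest")
args = TrainingArguments(batch_size=16, num_epochs=1)

trainer = Trainer(model=model, args=args, train_dataset=train_ds)
trainer.train()  # step 1: contrastive fine-tuning; step 2: fit the head
```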
## Evaluation
To be published
## Additional information
### Author
NLP4BIA at the Barcelona Supercomputing Center
### Licensing information
[Apache License, Version 2.0](https://www.apache.org/licenses/LICENSE-2.0)
### Citation information
To be published
### Disclaimer
<details>
<summary>Click to expand</summary>
The models published in this repository are intended for a generalist purpose and are available to third parties. These models may contain biases and/or other undesirable distortions.
When third parties deploy or provide systems and/or services to other parties using any of these models (or systems based on them), or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including those regarding the use of Artificial Intelligence.
</details> | [
"TEXT_CLASSIFICATION"
]
| [
"CANTEMIST",
"DISTEMIST",
"PHARMACONER",
"SYMPTEMIST"
]
| BioNLP |
| {"language": ["es"], "license": "apache-2.0", "pipeline_tag": "text-classification", "tags": ["setfit", "sentence-transformers", "text-classification", "bert", "biomedical", "lexical semantics", "bionlp"]} |
mav23/gte-Qwen2-1.5B-instruct-GGUF | mav23 | sentence-similarity | [
"sentence-transformers",
"gguf",
"mteb",
"transformers",
"Qwen2",
"sentence-similarity",
"arxiv:2308.03281",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us",
"conversational"
]
| 2024-10-11T14:04:27 | 2024-10-11T14:18:45 | 631 | 2 | ---
license: apache-2.0
tags:
- mteb
- sentence-transformers
- transformers
- Qwen2
- sentence-similarity
model-index:
- name: gte-qwen2-7B-instruct
results:
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (en)
type: mteb/amazon_counterfactual
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 83.98507462686567
- type: ap
value: 50.93015252587014
- type: f1
value: 78.50416599051215
- task:
type: Classification
dataset:
name: MTEB AmazonPolarityClassification
type: mteb/amazon_polarity
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 96.61065
- type: ap
value: 94.89174052954196
- type: f1
value: 96.60942596940565
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (en)
type: mteb/amazon_reviews_multi
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 55.614000000000004
- type: f1
value: 54.90553480294904
- task:
type: Retrieval
dataset:
name: MTEB ArguAna
type: mteb/arguana
config: default
split: test
revision: c22ab2a51041ffd869aaddef7af8d8215647e41a
metrics:
- type: map_at_1
value: 45.164
- type: map_at_10
value: 61.519
- type: map_at_100
value: 61.769
- type: map_at_1000
value: 61.769
- type: map_at_3
value: 57.443999999999996
- type: map_at_5
value: 60.058
- type: mrr_at_1
value: 46.088
- type: mrr_at_10
value: 61.861
- type: mrr_at_100
value: 62.117999999999995
- type: mrr_at_1000
value: 62.117999999999995
- type: mrr_at_3
value: 57.729
- type: mrr_at_5
value: 60.392
- type: ndcg_at_1
value: 45.164
- type: ndcg_at_10
value: 69.72
- type: ndcg_at_100
value: 70.719
- type: ndcg_at_1000
value: 70.719
- type: ndcg_at_3
value: 61.517999999999994
- type: ndcg_at_5
value: 66.247
- type: precision_at_1
value: 45.164
- type: precision_at_10
value: 9.545
- type: precision_at_100
value: 0.996
- type: precision_at_1000
value: 0.1
- type: precision_at_3
value: 24.443
- type: precision_at_5
value: 16.97
- type: recall_at_1
value: 45.164
- type: recall_at_10
value: 95.448
- type: recall_at_100
value: 99.644
- type: recall_at_1000
value: 99.644
- type: recall_at_3
value: 73.329
- type: recall_at_5
value: 84.851
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringP2P
type: mteb/arxiv-clustering-p2p
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 50.511868162026175
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringS2S
type: mteb/arxiv-clustering-s2s
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 45.007803189284004
- task:
type: Reranking
dataset:
name: MTEB AskUbuntuDupQuestions
type: mteb/askubuntudupquestions-reranking
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 64.55292107723382
- type: mrr
value: 77.66158818097877
- task:
type: STS
dataset:
name: MTEB BIOSSES
type: mteb/biosses-sts
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_pearson
value: 85.65459047085452
- type: cos_sim_spearman
value: 82.10729255710761
- type: euclidean_pearson
value: 82.78079159312476
- type: euclidean_spearman
value: 80.50002701880933
- type: manhattan_pearson
value: 82.41372641383016
- type: manhattan_spearman
value: 80.57412509272639
- task:
type: Classification
dataset:
name: MTEB Banking77Classification
type: mteb/banking77
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 87.30844155844156
- type: f1
value: 87.25307322443255
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringP2P
type: mteb/biorxiv-clustering-p2p
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 43.20754608934859
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringS2S
type: mteb/biorxiv-clustering-s2s
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 38.818037697335505
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackAndroidRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: f46a197baaae43b4f621051089b82a364682dfeb
metrics:
- type: map_at_1
value: 35.423
- type: map_at_10
value: 47.198
- type: map_at_100
value: 48.899
- type: map_at_1000
value: 49.004
- type: map_at_3
value: 43.114999999999995
- type: map_at_5
value: 45.491
- type: mrr_at_1
value: 42.918
- type: mrr_at_10
value: 53.299
- type: mrr_at_100
value: 54.032000000000004
- type: mrr_at_1000
value: 54.055
- type: mrr_at_3
value: 50.453
- type: mrr_at_5
value: 52.205999999999996
- type: ndcg_at_1
value: 42.918
- type: ndcg_at_10
value: 53.98
- type: ndcg_at_100
value: 59.57
- type: ndcg_at_1000
value: 60.879000000000005
- type: ndcg_at_3
value: 48.224000000000004
- type: ndcg_at_5
value: 50.998
- type: precision_at_1
value: 42.918
- type: precision_at_10
value: 10.299999999999999
- type: precision_at_100
value: 1.687
- type: precision_at_1000
value: 0.211
- type: precision_at_3
value: 22.842000000000002
- type: precision_at_5
value: 16.681
- type: recall_at_1
value: 35.423
- type: recall_at_10
value: 66.824
- type: recall_at_100
value: 89.564
- type: recall_at_1000
value: 97.501
- type: recall_at_3
value: 50.365
- type: recall_at_5
value: 57.921
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackEnglishRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: ad9991cb51e31e31e430383c75ffb2885547b5f0
metrics:
- type: map_at_1
value: 33.205
- type: map_at_10
value: 44.859
- type: map_at_100
value: 46.135
- type: map_at_1000
value: 46.259
- type: map_at_3
value: 41.839
- type: map_at_5
value: 43.662
- type: mrr_at_1
value: 41.146
- type: mrr_at_10
value: 50.621
- type: mrr_at_100
value: 51.207
- type: mrr_at_1000
value: 51.246
- type: mrr_at_3
value: 48.535000000000004
- type: mrr_at_5
value: 49.818
- type: ndcg_at_1
value: 41.146
- type: ndcg_at_10
value: 50.683
- type: ndcg_at_100
value: 54.82
- type: ndcg_at_1000
value: 56.69
- type: ndcg_at_3
value: 46.611000000000004
- type: ndcg_at_5
value: 48.66
- type: precision_at_1
value: 41.146
- type: precision_at_10
value: 9.439
- type: precision_at_100
value: 1.465
- type: precision_at_1000
value: 0.194
- type: precision_at_3
value: 22.59
- type: precision_at_5
value: 15.86
- type: recall_at_1
value: 33.205
- type: recall_at_10
value: 61.028999999999996
- type: recall_at_100
value: 78.152
- type: recall_at_1000
value: 89.59700000000001
- type: recall_at_3
value: 49.05
- type: recall_at_5
value: 54.836
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGamingRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: 4885aa143210c98657558c04aaf3dc47cfb54340
metrics:
- type: map_at_1
value: 41.637
- type: map_at_10
value: 55.162
- type: map_at_100
value: 56.142
- type: map_at_1000
value: 56.188
- type: map_at_3
value: 51.564
- type: map_at_5
value: 53.696
- type: mrr_at_1
value: 47.524
- type: mrr_at_10
value: 58.243
- type: mrr_at_100
value: 58.879999999999995
- type: mrr_at_1000
value: 58.9
- type: mrr_at_3
value: 55.69499999999999
- type: mrr_at_5
value: 57.284
- type: ndcg_at_1
value: 47.524
- type: ndcg_at_10
value: 61.305
- type: ndcg_at_100
value: 65.077
- type: ndcg_at_1000
value: 65.941
- type: ndcg_at_3
value: 55.422000000000004
- type: ndcg_at_5
value: 58.516
- type: precision_at_1
value: 47.524
- type: precision_at_10
value: 9.918000000000001
- type: precision_at_100
value: 1.276
- type: precision_at_1000
value: 0.13899999999999998
- type: precision_at_3
value: 24.765
- type: precision_at_5
value: 17.204
- type: recall_at_1
value: 41.637
- type: recall_at_10
value: 76.185
- type: recall_at_100
value: 92.149
- type: recall_at_1000
value: 98.199
- type: recall_at_3
value: 60.856
- type: recall_at_5
value: 68.25099999999999
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGisRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: 5003b3064772da1887988e05400cf3806fe491f2
metrics:
- type: map_at_1
value: 26.27
- type: map_at_10
value: 37.463
- type: map_at_100
value: 38.434000000000005
- type: map_at_1000
value: 38.509
- type: map_at_3
value: 34.226
- type: map_at_5
value: 36.161
- type: mrr_at_1
value: 28.588
- type: mrr_at_10
value: 39.383
- type: mrr_at_100
value: 40.23
- type: mrr_at_1000
value: 40.281
- type: mrr_at_3
value: 36.422
- type: mrr_at_5
value: 38.252
- type: ndcg_at_1
value: 28.588
- type: ndcg_at_10
value: 43.511
- type: ndcg_at_100
value: 48.274
- type: ndcg_at_1000
value: 49.975
- type: ndcg_at_3
value: 37.319
- type: ndcg_at_5
value: 40.568
- type: precision_at_1
value: 28.588
- type: precision_at_10
value: 6.893000000000001
- type: precision_at_100
value: 0.9900000000000001
- type: precision_at_1000
value: 0.117
- type: precision_at_3
value: 16.347
- type: precision_at_5
value: 11.661000000000001
- type: recall_at_1
value: 26.27
- type: recall_at_10
value: 60.284000000000006
- type: recall_at_100
value: 81.902
- type: recall_at_1000
value: 94.43
- type: recall_at_3
value: 43.537
- type: recall_at_5
value: 51.475
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackMathematicaRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: 90fceea13679c63fe563ded68f3b6f06e50061de
metrics:
- type: map_at_1
value: 18.168
- type: map_at_10
value: 28.410000000000004
- type: map_at_100
value: 29.78
- type: map_at_1000
value: 29.892999999999997
- type: map_at_3
value: 25.238
- type: map_at_5
value: 26.96
- type: mrr_at_1
value: 23.507
- type: mrr_at_10
value: 33.382
- type: mrr_at_100
value: 34.404
- type: mrr_at_1000
value: 34.467999999999996
- type: mrr_at_3
value: 30.637999999999998
- type: mrr_at_5
value: 32.199
- type: ndcg_at_1
value: 23.507
- type: ndcg_at_10
value: 34.571000000000005
- type: ndcg_at_100
value: 40.663
- type: ndcg_at_1000
value: 43.236000000000004
- type: ndcg_at_3
value: 29.053
- type: ndcg_at_5
value: 31.563999999999997
- type: precision_at_1
value: 23.507
- type: precision_at_10
value: 6.654
- type: precision_at_100
value: 1.113
- type: precision_at_1000
value: 0.146
- type: precision_at_3
value: 14.427999999999999
- type: precision_at_5
value: 10.498000000000001
- type: recall_at_1
value: 18.168
- type: recall_at_10
value: 48.443000000000005
- type: recall_at_100
value: 74.47
- type: recall_at_1000
value: 92.494
- type: recall_at_3
value: 33.379999999999995
- type: recall_at_5
value: 39.76
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackPhysicsRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: 79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4
metrics:
- type: map_at_1
value: 32.39
- type: map_at_10
value: 44.479
- type: map_at_100
value: 45.977000000000004
- type: map_at_1000
value: 46.087
- type: map_at_3
value: 40.976
- type: map_at_5
value: 43.038
- type: mrr_at_1
value: 40.135
- type: mrr_at_10
value: 50.160000000000004
- type: mrr_at_100
value: 51.052
- type: mrr_at_1000
value: 51.087
- type: mrr_at_3
value: 47.818
- type: mrr_at_5
value: 49.171
- type: ndcg_at_1
value: 40.135
- type: ndcg_at_10
value: 50.731
- type: ndcg_at_100
value: 56.452000000000005
- type: ndcg_at_1000
value: 58.123000000000005
- type: ndcg_at_3
value: 45.507
- type: ndcg_at_5
value: 48.11
- type: precision_at_1
value: 40.135
- type: precision_at_10
value: 9.192
- type: precision_at_100
value: 1.397
- type: precision_at_1000
value: 0.169
- type: precision_at_3
value: 21.816
- type: precision_at_5
value: 15.476
- type: recall_at_1
value: 32.39
- type: recall_at_10
value: 63.597
- type: recall_at_100
value: 86.737
- type: recall_at_1000
value: 97.039
- type: recall_at_3
value: 48.906
- type: recall_at_5
value: 55.659000000000006
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackProgrammersRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: 6184bc1440d2dbc7612be22b50686b8826d22b32
metrics:
- type: map_at_1
value: 28.397
- type: map_at_10
value: 39.871
- type: map_at_100
value: 41.309000000000005
- type: map_at_1000
value: 41.409
- type: map_at_3
value: 36.047000000000004
- type: map_at_5
value: 38.104
- type: mrr_at_1
value: 34.703
- type: mrr_at_10
value: 44.773
- type: mrr_at_100
value: 45.64
- type: mrr_at_1000
value: 45.678999999999995
- type: mrr_at_3
value: 41.705
- type: mrr_at_5
value: 43.406
- type: ndcg_at_1
value: 34.703
- type: ndcg_at_10
value: 46.271
- type: ndcg_at_100
value: 52.037
- type: ndcg_at_1000
value: 53.81700000000001
- type: ndcg_at_3
value: 39.966
- type: ndcg_at_5
value: 42.801
- type: precision_at_1
value: 34.703
- type: precision_at_10
value: 8.744
- type: precision_at_100
value: 1.348
- type: precision_at_1000
value: 0.167
- type: precision_at_3
value: 19.102
- type: precision_at_5
value: 13.836
- type: recall_at_1
value: 28.397
- type: recall_at_10
value: 60.299
- type: recall_at_100
value: 84.595
- type: recall_at_1000
value: 96.155
- type: recall_at_3
value: 43.065
- type: recall_at_5
value: 50.371
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4
metrics:
- type: map_at_1
value: 28.044333333333338
- type: map_at_10
value: 38.78691666666666
- type: map_at_100
value: 40.113
- type: map_at_1000
value: 40.22125
- type: map_at_3
value: 35.52966666666667
- type: map_at_5
value: 37.372749999999996
- type: mrr_at_1
value: 33.159083333333335
- type: mrr_at_10
value: 42.913583333333335
- type: mrr_at_100
value: 43.7845
- type: mrr_at_1000
value: 43.830333333333336
- type: mrr_at_3
value: 40.29816666666667
- type: mrr_at_5
value: 41.81366666666667
- type: ndcg_at_1
value: 33.159083333333335
- type: ndcg_at_10
value: 44.75750000000001
- type: ndcg_at_100
value: 50.13658333333334
- type: ndcg_at_1000
value: 52.037
- type: ndcg_at_3
value: 39.34258333333334
- type: ndcg_at_5
value: 41.93708333333333
- type: precision_at_1
value: 33.159083333333335
- type: precision_at_10
value: 7.952416666666667
- type: precision_at_100
value: 1.2571666666666668
- type: precision_at_1000
value: 0.16099999999999998
- type: precision_at_3
value: 18.303833333333337
- type: precision_at_5
value: 13.057083333333333
- type: recall_at_1
value: 28.044333333333338
- type: recall_at_10
value: 58.237249999999996
- type: recall_at_100
value: 81.35391666666666
- type: recall_at_1000
value: 94.21283333333334
- type: recall_at_3
value: 43.32341666666667
- type: recall_at_5
value: 49.94908333333333
- type: map_at_1
value: 18.398
- type: map_at_10
value: 27.929
- type: map_at_100
value: 29.032999999999998
- type: map_at_1000
value: 29.126
- type: map_at_3
value: 25.070999999999998
- type: map_at_5
value: 26.583000000000002
- type: mrr_at_1
value: 19.963
- type: mrr_at_10
value: 29.997
- type: mrr_at_100
value: 30.9
- type: mrr_at_1000
value: 30.972
- type: mrr_at_3
value: 27.264
- type: mrr_at_5
value: 28.826
- type: ndcg_at_1
value: 19.963
- type: ndcg_at_10
value: 33.678999999999995
- type: ndcg_at_100
value: 38.931
- type: ndcg_at_1000
value: 41.379
- type: ndcg_at_3
value: 28.000000000000004
- type: ndcg_at_5
value: 30.637999999999998
- type: precision_at_1
value: 19.963
- type: precision_at_10
value: 5.7299999999999995
- type: precision_at_100
value: 0.902
- type: precision_at_1000
value: 0.122
- type: precision_at_3
value: 12.631
- type: precision_at_5
value: 9.057
- type: recall_at_1
value: 18.398
- type: recall_at_10
value: 49.254
- type: recall_at_100
value: 73.182
- type: recall_at_1000
value: 91.637
- type: recall_at_3
value: 34.06
- type: recall_at_5
value: 40.416000000000004
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackStatsRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: 65ac3a16b8e91f9cee4c9828cc7c335575432a2a
metrics:
- type: map_at_1
value: 27.838
- type: map_at_10
value: 36.04
- type: map_at_100
value: 37.113
- type: map_at_1000
value: 37.204
- type: map_at_3
value: 33.585
- type: map_at_5
value: 34.845
- type: mrr_at_1
value: 30.982
- type: mrr_at_10
value: 39.105000000000004
- type: mrr_at_100
value: 39.98
- type: mrr_at_1000
value: 40.042
- type: mrr_at_3
value: 36.912
- type: mrr_at_5
value: 38.062000000000005
- type: ndcg_at_1
value: 30.982
- type: ndcg_at_10
value: 40.982
- type: ndcg_at_100
value: 46.092
- type: ndcg_at_1000
value: 48.25
- type: ndcg_at_3
value: 36.41
- type: ndcg_at_5
value: 38.379999999999995
- type: precision_at_1
value: 30.982
- type: precision_at_10
value: 6.534
- type: precision_at_100
value: 0.9820000000000001
- type: precision_at_1000
value: 0.124
- type: precision_at_3
value: 15.745999999999999
- type: precision_at_5
value: 10.828
- type: recall_at_1
value: 27.838
- type: recall_at_10
value: 52.971000000000004
- type: recall_at_100
value: 76.357
- type: recall_at_1000
value: 91.973
- type: recall_at_3
value: 40.157
- type: recall_at_5
value: 45.147999999999996
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackTexRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: 46989137a86843e03a6195de44b09deda022eec7
metrics:
- type: map_at_1
value: 19.059
- type: map_at_10
value: 27.454
- type: map_at_100
value: 28.736
- type: map_at_1000
value: 28.865000000000002
- type: map_at_3
value: 24.773999999999997
- type: map_at_5
value: 26.266000000000002
- type: mrr_at_1
value: 23.125
- type: mrr_at_10
value: 31.267
- type: mrr_at_100
value: 32.32
- type: mrr_at_1000
value: 32.394
- type: mrr_at_3
value: 28.894
- type: mrr_at_5
value: 30.281000000000002
- type: ndcg_at_1
value: 23.125
- type: ndcg_at_10
value: 32.588
- type: ndcg_at_100
value: 38.432
- type: ndcg_at_1000
value: 41.214
- type: ndcg_at_3
value: 27.938000000000002
- type: ndcg_at_5
value: 30.127
- type: precision_at_1
value: 23.125
- type: precision_at_10
value: 5.9639999999999995
- type: precision_at_100
value: 1.047
- type: precision_at_1000
value: 0.148
- type: precision_at_3
value: 13.294
- type: precision_at_5
value: 9.628
- type: recall_at_1
value: 19.059
- type: recall_at_10
value: 44.25
- type: recall_at_100
value: 69.948
- type: recall_at_1000
value: 89.35300000000001
- type: recall_at_3
value: 31.114000000000004
- type: recall_at_5
value: 36.846000000000004
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackUnixRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: 6c6430d3a6d36f8d2a829195bc5dc94d7e063e53
metrics:
- type: map_at_1
value: 28.355999999999998
- type: map_at_10
value: 39.055
- type: map_at_100
value: 40.486
- type: map_at_1000
value: 40.571
- type: map_at_3
value: 35.69
- type: map_at_5
value: 37.605
- type: mrr_at_1
value: 33.302
- type: mrr_at_10
value: 42.986000000000004
- type: mrr_at_100
value: 43.957
- type: mrr_at_1000
value: 43.996
- type: mrr_at_3
value: 40.111999999999995
- type: mrr_at_5
value: 41.735
- type: ndcg_at_1
value: 33.302
- type: ndcg_at_10
value: 44.962999999999994
- type: ndcg_at_100
value: 50.917
- type: ndcg_at_1000
value: 52.622
- type: ndcg_at_3
value: 39.182
- type: ndcg_at_5
value: 41.939
- type: precision_at_1
value: 33.302
- type: precision_at_10
value: 7.779999999999999
- type: precision_at_100
value: 1.203
- type: precision_at_1000
value: 0.145
- type: precision_at_3
value: 18.035
- type: precision_at_5
value: 12.873000000000001
- type: recall_at_1
value: 28.355999999999998
- type: recall_at_10
value: 58.782000000000004
- type: recall_at_100
value: 84.02199999999999
- type: recall_at_1000
value: 95.511
- type: recall_at_3
value: 43.126999999999995
- type: recall_at_5
value: 50.14999999999999
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackWebmastersRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: 160c094312a0e1facb97e55eeddb698c0abe3571
metrics:
- type: map_at_1
value: 27.391
- type: map_at_10
value: 37.523
- type: map_at_100
value: 39.312000000000005
- type: map_at_1000
value: 39.54
- type: map_at_3
value: 34.231
- type: map_at_5
value: 36.062
- type: mrr_at_1
value: 32.016
- type: mrr_at_10
value: 41.747
- type: mrr_at_100
value: 42.812
- type: mrr_at_1000
value: 42.844
- type: mrr_at_3
value: 39.129999999999995
- type: mrr_at_5
value: 40.524
- type: ndcg_at_1
value: 32.016
- type: ndcg_at_10
value: 43.826
- type: ndcg_at_100
value: 50.373999999999995
- type: ndcg_at_1000
value: 52.318
- type: ndcg_at_3
value: 38.479
- type: ndcg_at_5
value: 40.944
- type: precision_at_1
value: 32.016
- type: precision_at_10
value: 8.280999999999999
- type: precision_at_100
value: 1.6760000000000002
- type: precision_at_1000
value: 0.25
- type: precision_at_3
value: 18.05
- type: precision_at_5
value: 13.083
- type: recall_at_1
value: 27.391
- type: recall_at_10
value: 56.928999999999995
- type: recall_at_100
value: 85.169
- type: recall_at_1000
value: 96.665
- type: recall_at_3
value: 42.264
- type: recall_at_5
value: 48.556
- task:
type: Retrieval
dataset:
name: MTEB ClimateFEVER
type: mteb/climate-fever
config: default
split: test
revision: 47f2ac6acb640fc46020b02a5b59fdda04d39380
metrics:
- type: map_at_1
value: 19.681
- type: map_at_10
value: 32.741
- type: map_at_100
value: 34.811
- type: map_at_1000
value: 35.003
- type: map_at_3
value: 27.697
- type: map_at_5
value: 30.372
- type: mrr_at_1
value: 44.951
- type: mrr_at_10
value: 56.34400000000001
- type: mrr_at_100
value: 56.961
- type: mrr_at_1000
value: 56.987
- type: mrr_at_3
value: 53.681
- type: mrr_at_5
value: 55.407
- type: ndcg_at_1
value: 44.951
- type: ndcg_at_10
value: 42.905
- type: ndcg_at_100
value: 49.95
- type: ndcg_at_1000
value: 52.917
- type: ndcg_at_3
value: 36.815
- type: ndcg_at_5
value: 38.817
- type: precision_at_1
value: 44.951
- type: precision_at_10
value: 12.989999999999998
- type: precision_at_100
value: 2.068
- type: precision_at_1000
value: 0.263
- type: precision_at_3
value: 27.275
- type: precision_at_5
value: 20.365
- type: recall_at_1
value: 19.681
- type: recall_at_10
value: 48.272999999999996
- type: recall_at_100
value: 71.87400000000001
- type: recall_at_1000
value: 87.929
- type: recall_at_3
value: 32.653999999999996
- type: recall_at_5
value: 39.364
- task:
type: Retrieval
dataset:
name: MTEB DBPedia
type: mteb/dbpedia
config: default
split: test
revision: c0f706b76e590d620bd6618b3ca8efdd34e2d659
metrics:
- type: map_at_1
value: 10.231
- type: map_at_10
value: 22.338
- type: map_at_100
value: 31.927
- type: map_at_1000
value: 33.87
- type: map_at_3
value: 15.559999999999999
- type: map_at_5
value: 18.239
- type: mrr_at_1
value: 75.0
- type: mrr_at_10
value: 81.303
- type: mrr_at_100
value: 81.523
- type: mrr_at_1000
value: 81.53
- type: mrr_at_3
value: 80.083
- type: mrr_at_5
value: 80.758
- type: ndcg_at_1
value: 64.625
- type: ndcg_at_10
value: 48.687000000000005
- type: ndcg_at_100
value: 52.791
- type: ndcg_at_1000
value: 60.041999999999994
- type: ndcg_at_3
value: 53.757999999999996
- type: ndcg_at_5
value: 50.76500000000001
- type: precision_at_1
value: 75.0
- type: precision_at_10
value: 38.3
- type: precision_at_100
value: 12.025
- type: precision_at_1000
value: 2.3970000000000002
- type: precision_at_3
value: 55.417
- type: precision_at_5
value: 47.5
- type: recall_at_1
value: 10.231
- type: recall_at_10
value: 27.697
- type: recall_at_100
value: 57.409
- type: recall_at_1000
value: 80.547
- type: recall_at_3
value: 16.668
- type: recall_at_5
value: 20.552
- task:
type: Classification
dataset:
name: MTEB EmotionClassification
type: mteb/emotion
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 61.365
- type: f1
value: 56.7540827912991
- task:
type: Retrieval
dataset:
name: MTEB FEVER
type: mteb/fever
config: default
split: test
revision: bea83ef9e8fb933d90a2f1d5515737465d613e12
metrics:
- type: map_at_1
value: 83.479
- type: map_at_10
value: 88.898
- type: map_at_100
value: 89.11
- type: map_at_1000
value: 89.12400000000001
- type: map_at_3
value: 88.103
- type: map_at_5
value: 88.629
- type: mrr_at_1
value: 89.934
- type: mrr_at_10
value: 93.91000000000001
- type: mrr_at_100
value: 93.937
- type: mrr_at_1000
value: 93.938
- type: mrr_at_3
value: 93.62700000000001
- type: mrr_at_5
value: 93.84599999999999
- type: ndcg_at_1
value: 89.934
- type: ndcg_at_10
value: 91.574
- type: ndcg_at_100
value: 92.238
- type: ndcg_at_1000
value: 92.45
- type: ndcg_at_3
value: 90.586
- type: ndcg_at_5
value: 91.16300000000001
- type: precision_at_1
value: 89.934
- type: precision_at_10
value: 10.555
- type: precision_at_100
value: 1.1159999999999999
- type: precision_at_1000
value: 0.11499999999999999
- type: precision_at_3
value: 33.588
- type: precision_at_5
value: 20.642
- type: recall_at_1
value: 83.479
- type: recall_at_10
value: 94.971
- type: recall_at_100
value: 97.397
- type: recall_at_1000
value: 98.666
- type: recall_at_3
value: 92.24799999999999
- type: recall_at_5
value: 93.797
- task:
type: Retrieval
dataset:
name: MTEB FiQA2018
type: mteb/fiqa
config: default
split: test
revision: 27a168819829fe9bcd655c2df245fb19452e8e06
metrics:
- type: map_at_1
value: 27.16
- type: map_at_10
value: 45.593
- type: map_at_100
value: 47.762
- type: map_at_1000
value: 47.899
- type: map_at_3
value: 39.237
- type: map_at_5
value: 42.970000000000006
- type: mrr_at_1
value: 52.623
- type: mrr_at_10
value: 62.637
- type: mrr_at_100
value: 63.169
- type: mrr_at_1000
value: 63.185
- type: mrr_at_3
value: 59.928000000000004
- type: mrr_at_5
value: 61.702999999999996
- type: ndcg_at_1
value: 52.623
- type: ndcg_at_10
value: 54.701
- type: ndcg_at_100
value: 61.263
- type: ndcg_at_1000
value: 63.134
- type: ndcg_at_3
value: 49.265
- type: ndcg_at_5
value: 51.665000000000006
- type: precision_at_1
value: 52.623
- type: precision_at_10
value: 15.185
- type: precision_at_100
value: 2.202
- type: precision_at_1000
value: 0.254
- type: precision_at_3
value: 32.767
- type: precision_at_5
value: 24.722
- type: recall_at_1
value: 27.16
- type: recall_at_10
value: 63.309000000000005
- type: recall_at_100
value: 86.722
- type: recall_at_1000
value: 97.505
- type: recall_at_3
value: 45.045
- type: recall_at_5
value: 54.02400000000001
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA
type: mteb/hotpotqa
config: default
split: test
revision: ab518f4d6fcca38d87c25209f94beba119d02014
metrics:
- type: map_at_1
value: 42.573
- type: map_at_10
value: 59.373
- type: map_at_100
value: 60.292
- type: map_at_1000
value: 60.358999999999995
- type: map_at_3
value: 56.159000000000006
- type: map_at_5
value: 58.123999999999995
- type: mrr_at_1
value: 85.14500000000001
- type: mrr_at_10
value: 89.25999999999999
- type: mrr_at_100
value: 89.373
- type: mrr_at_1000
value: 89.377
- type: mrr_at_3
value: 88.618
- type: mrr_at_5
value: 89.036
- type: ndcg_at_1
value: 85.14500000000001
- type: ndcg_at_10
value: 68.95
- type: ndcg_at_100
value: 71.95
- type: ndcg_at_1000
value: 73.232
- type: ndcg_at_3
value: 64.546
- type: ndcg_at_5
value: 66.945
- type: precision_at_1
value: 85.14500000000001
- type: precision_at_10
value: 13.865
- type: precision_at_100
value: 1.619
- type: precision_at_1000
value: 0.179
- type: precision_at_3
value: 39.703
- type: precision_at_5
value: 25.718000000000004
- type: recall_at_1
value: 42.573
- type: recall_at_10
value: 69.325
- type: recall_at_100
value: 80.932
- type: recall_at_1000
value: 89.446
- type: recall_at_3
value: 59.553999999999995
- type: recall_at_5
value: 64.294
- task:
type: Classification
dataset:
name: MTEB ImdbClassification
type: mteb/imdb
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 95.8336
- type: ap
value: 93.78862962194073
- type: f1
value: 95.83192650728371
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO
type: mteb/msmarco
config: default
split: dev
revision: c5a29a104738b98a9e76336939199e264163d4a0
metrics:
- type: map_at_1
value: 23.075000000000003
- type: map_at_10
value: 36.102000000000004
- type: map_at_100
value: 37.257
- type: map_at_1000
value: 37.3
- type: map_at_3
value: 32.144
- type: map_at_5
value: 34.359
- type: mrr_at_1
value: 23.711
- type: mrr_at_10
value: 36.671
- type: mrr_at_100
value: 37.763999999999996
- type: mrr_at_1000
value: 37.801
- type: mrr_at_3
value: 32.775
- type: mrr_at_5
value: 34.977000000000004
- type: ndcg_at_1
value: 23.711
- type: ndcg_at_10
value: 43.361
- type: ndcg_at_100
value: 48.839
- type: ndcg_at_1000
value: 49.88
- type: ndcg_at_3
value: 35.269
- type: ndcg_at_5
value: 39.224
- type: precision_at_1
value: 23.711
- type: precision_at_10
value: 6.866999999999999
- type: precision_at_100
value: 0.96
- type: precision_at_1000
value: 0.105
- type: precision_at_3
value: 15.096000000000002
- type: precision_at_5
value: 11.083
- type: recall_at_1
value: 23.075000000000003
- type: recall_at_10
value: 65.756
- type: recall_at_100
value: 90.88199999999999
- type: recall_at_1000
value: 98.739
- type: recall_at_3
value: 43.691
- type: recall_at_5
value: 53.15800000000001
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (en)
type: mteb/mtop_domain
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 97.69493844049248
- type: f1
value: 97.55048089616261
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (en)
type: mteb/mtop_intent
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 88.75968992248062
- type: f1
value: 72.26321223399123
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (en)
type: mteb/amazon_massive_intent
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 82.40080699394754
- type: f1
value: 79.62590029057968
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (en)
type: mteb/amazon_massive_scenario
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 84.49562878278414
- type: f1
value: 84.0040193313333
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringP2P
type: mteb/medrxiv-clustering-p2p
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 39.386760057101945
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringS2S
type: mteb/medrxiv-clustering-s2s
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 37.89687154075537
- task:
type: Reranking
dataset:
name: MTEB MindSmallReranking
type: mteb/mind_small
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 33.94151656057482
- type: mrr
value: 35.32684700746953
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus
type: mteb/nfcorpus
config: default
split: test
revision: ec0fa4fe99da2ff19ca1214b7966684033a58814
metrics:
- type: map_at_1
value: 6.239999999999999
- type: map_at_10
value: 14.862
- type: map_at_100
value: 18.955
- type: map_at_1000
value: 20.694000000000003
- type: map_at_3
value: 10.683
- type: map_at_5
value: 12.674
- type: mrr_at_1
value: 50.15500000000001
- type: mrr_at_10
value: 59.697
- type: mrr_at_100
value: 60.095
- type: mrr_at_1000
value: 60.129999999999995
- type: mrr_at_3
value: 58.35900000000001
- type: mrr_at_5
value: 58.839
- type: ndcg_at_1
value: 48.452
- type: ndcg_at_10
value: 39.341
- type: ndcg_at_100
value: 35.866
- type: ndcg_at_1000
value: 45.111000000000004
- type: ndcg_at_3
value: 44.527
- type: ndcg_at_5
value: 42.946
- type: precision_at_1
value: 50.15500000000001
- type: precision_at_10
value: 29.536
- type: precision_at_100
value: 9.142
- type: precision_at_1000
value: 2.2849999999999997
- type: precision_at_3
value: 41.899
- type: precision_at_5
value: 37.647000000000006
- type: recall_at_1
value: 6.239999999999999
- type: recall_at_10
value: 19.278000000000002
- type: recall_at_100
value: 36.074
- type: recall_at_1000
value: 70.017
- type: recall_at_3
value: 12.066
- type: recall_at_5
value: 15.254000000000001
- task:
type: Retrieval
dataset:
name: MTEB NQ
type: mteb/nq
config: default
split: test
revision: b774495ed302d8c44a3a7ea25c90dbce03968f31
metrics:
- type: map_at_1
value: 39.75
- type: map_at_10
value: 56.443
- type: map_at_100
value: 57.233999999999995
- type: map_at_1000
value: 57.249
- type: map_at_3
value: 52.032999999999994
- type: map_at_5
value: 54.937999999999995
- type: mrr_at_1
value: 44.728
- type: mrr_at_10
value: 58.939
- type: mrr_at_100
value: 59.489000000000004
- type: mrr_at_1000
value: 59.499
- type: mrr_at_3
value: 55.711999999999996
- type: mrr_at_5
value: 57.89
- type: ndcg_at_1
value: 44.728
- type: ndcg_at_10
value: 63.998999999999995
- type: ndcg_at_100
value: 67.077
- type: ndcg_at_1000
value: 67.40899999999999
- type: ndcg_at_3
value: 56.266000000000005
- type: ndcg_at_5
value: 60.88
- type: precision_at_1
value: 44.728
- type: precision_at_10
value: 10.09
- type: precision_at_100
value: 1.1809999999999998
- type: precision_at_1000
value: 0.121
- type: precision_at_3
value: 25.145
- type: precision_at_5
value: 17.822
- type: recall_at_1
value: 39.75
- type: recall_at_10
value: 84.234
- type: recall_at_100
value: 97.055
- type: recall_at_1000
value: 99.517
- type: recall_at_3
value: 64.851
- type: recall_at_5
value: 75.343
- task:
type: Retrieval
dataset:
name: MTEB QuoraRetrieval
type: mteb/quora
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 72.085
- type: map_at_10
value: 86.107
- type: map_at_100
value: 86.727
- type: map_at_1000
value: 86.74
- type: map_at_3
value: 83.21
- type: map_at_5
value: 85.06
- type: mrr_at_1
value: 82.94
- type: mrr_at_10
value: 88.845
- type: mrr_at_100
value: 88.926
- type: mrr_at_1000
value: 88.927
- type: mrr_at_3
value: 87.993
- type: mrr_at_5
value: 88.62299999999999
- type: ndcg_at_1
value: 82.97
- type: ndcg_at_10
value: 89.645
- type: ndcg_at_100
value: 90.717
- type: ndcg_at_1000
value: 90.78
- type: ndcg_at_3
value: 86.99900000000001
- type: ndcg_at_5
value: 88.52600000000001
- type: precision_at_1
value: 82.97
- type: precision_at_10
value: 13.569
- type: precision_at_100
value: 1.539
- type: precision_at_1000
value: 0.157
- type: precision_at_3
value: 38.043
- type: precision_at_5
value: 24.992
- type: recall_at_1
value: 72.085
- type: recall_at_10
value: 96.262
- type: recall_at_100
value: 99.77000000000001
- type: recall_at_1000
value: 99.997
- type: recall_at_3
value: 88.652
- type: recall_at_5
value: 93.01899999999999
- task:
type: Clustering
dataset:
name: MTEB RedditClustering
type: mteb/reddit-clustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 55.82153952668092
- task:
type: Clustering
dataset:
name: MTEB RedditClusteringP2P
type: mteb/reddit-clustering-p2p
config: default
split: test
revision: 282350215ef01743dc01b456c7f5241fa8937f16
metrics:
- type: v_measure
value: 62.094465801879295
- task:
type: Retrieval
dataset:
name: MTEB SCIDOCS
type: mteb/scidocs
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 5.688
- type: map_at_10
value: 15.201999999999998
- type: map_at_100
value: 18.096
- type: map_at_1000
value: 18.481
- type: map_at_3
value: 10.734
- type: map_at_5
value: 12.94
- type: mrr_at_1
value: 28.000000000000004
- type: mrr_at_10
value: 41.101
- type: mrr_at_100
value: 42.202
- type: mrr_at_1000
value: 42.228
- type: mrr_at_3
value: 37.683
- type: mrr_at_5
value: 39.708
- type: ndcg_at_1
value: 28.000000000000004
- type: ndcg_at_10
value: 24.976000000000003
- type: ndcg_at_100
value: 35.129
- type: ndcg_at_1000
value: 40.77
- type: ndcg_at_3
value: 23.787
- type: ndcg_at_5
value: 20.816000000000003
- type: precision_at_1
value: 28.000000000000004
- type: precision_at_10
value: 13.04
- type: precision_at_100
value: 2.761
- type: precision_at_1000
value: 0.41000000000000003
- type: precision_at_3
value: 22.6
- type: precision_at_5
value: 18.52
- type: recall_at_1
value: 5.688
- type: recall_at_10
value: 26.43
- type: recall_at_100
value: 56.02
- type: recall_at_1000
value: 83.21
- type: recall_at_3
value: 13.752
- type: recall_at_5
value: 18.777
- task:
type: STS
dataset:
name: MTEB SICK-R
type: mteb/sickr-sts
config: default
split: test
revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
metrics:
- type: cos_sim_pearson
value: 85.15084859283178
- type: cos_sim_spearman
value: 80.49030614009419
- type: euclidean_pearson
value: 81.84574978672468
- type: euclidean_spearman
value: 79.89787150656818
- type: manhattan_pearson
value: 81.63076538567131
- type: manhattan_spearman
value: 79.69867352121841
- task:
type: STS
dataset:
name: MTEB STS12
type: mteb/sts12-sts
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_pearson
value: 84.64097921490992
- type: cos_sim_spearman
value: 77.25370084896514
- type: euclidean_pearson
value: 82.71210826468788
- type: euclidean_spearman
value: 78.50445584994826
- type: manhattan_pearson
value: 82.92580164330298
- type: manhattan_spearman
value: 78.69686891301019
- task:
type: STS
dataset:
name: MTEB STS13
type: mteb/sts13-sts
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_pearson
value: 87.24596417308994
- type: cos_sim_spearman
value: 87.79454220555091
- type: euclidean_pearson
value: 87.40242561671164
- type: euclidean_spearman
value: 88.25955597373556
- type: manhattan_pearson
value: 87.25160240485849
- type: manhattan_spearman
value: 88.155794979818
- task:
type: STS
dataset:
name: MTEB STS14
type: mteb/sts14-sts
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_pearson
value: 84.44914233422564
- type: cos_sim_spearman
value: 82.91015471820322
- type: euclidean_pearson
value: 84.7206656630327
- type: euclidean_spearman
value: 83.86408872059216
- type: manhattan_pearson
value: 84.72816725158454
- type: manhattan_spearman
value: 84.01603388572788
- task:
type: STS
dataset:
name: MTEB STS15
type: mteb/sts15-sts
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_pearson
value: 87.6168026237477
- type: cos_sim_spearman
value: 88.45414278092397
- type: euclidean_pearson
value: 88.57023240882022
- type: euclidean_spearman
value: 89.04102190922094
- type: manhattan_pearson
value: 88.66695535796354
- type: manhattan_spearman
value: 89.19898476680969
- task:
type: STS
dataset:
name: MTEB STS16
type: mteb/sts16-sts
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_pearson
value: 84.27925826089424
- type: cos_sim_spearman
value: 85.45291099550461
- type: euclidean_pearson
value: 83.63853036580834
- type: euclidean_spearman
value: 84.33468035821484
- type: manhattan_pearson
value: 83.72778773251596
- type: manhattan_spearman
value: 84.51583132445376
- task:
type: STS
dataset:
name: MTEB STS17 (en-en)
type: mteb/sts17-crosslingual-sts
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 89.67375185692552
- type: cos_sim_spearman
value: 90.32542469203855
- type: euclidean_pearson
value: 89.63513717951847
- type: euclidean_spearman
value: 89.87760271003745
- type: manhattan_pearson
value: 89.28381452982924
- type: manhattan_spearman
value: 89.53568197785721
- task:
type: STS
dataset:
name: MTEB STS22 (en)
type: mteb/sts22-crosslingual-sts
config: en
split: test
revision: eea2b4fe26a775864c896887d910b76a8098ad3f
metrics:
- type: cos_sim_pearson
value: 66.24644693819846
- type: cos_sim_spearman
value: 66.09889420525377
- type: euclidean_pearson
value: 63.72551583520747
- type: euclidean_spearman
value: 63.01385470780679
- type: manhattan_pearson
value: 64.09258157214097
- type: manhattan_spearman
value: 63.080517752822594
- task:
type: STS
dataset:
name: MTEB STSBenchmark
type: mteb/stsbenchmark-sts
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_pearson
value: 86.27321463839989
- type: cos_sim_spearman
value: 86.37572865993327
- type: euclidean_pearson
value: 86.36268020198149
- type: euclidean_spearman
value: 86.31089339478922
- type: manhattan_pearson
value: 86.4260445761947
- type: manhattan_spearman
value: 86.45885895320457
- task:
type: Reranking
dataset:
name: MTEB SciDocsRR
type: mteb/scidocs-reranking
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 86.52456702387798
- type: mrr
value: 96.34556529164372
- task:
type: Retrieval
dataset:
name: MTEB SciFact
type: mteb/scifact
config: default
split: test
revision: 0228b52cf27578f30900b9e5271d331663a030d7
metrics:
- type: map_at_1
value: 61.99400000000001
- type: map_at_10
value: 73.38799999999999
- type: map_at_100
value: 73.747
- type: map_at_1000
value: 73.75
- type: map_at_3
value: 70.04599999999999
- type: map_at_5
value: 72.095
- type: mrr_at_1
value: 65.0
- type: mrr_at_10
value: 74.42800000000001
- type: mrr_at_100
value: 74.722
- type: mrr_at_1000
value: 74.725
- type: mrr_at_3
value: 72.056
- type: mrr_at_5
value: 73.60600000000001
- type: ndcg_at_1
value: 65.0
- type: ndcg_at_10
value: 78.435
- type: ndcg_at_100
value: 79.922
- type: ndcg_at_1000
value: 80.00500000000001
- type: ndcg_at_3
value: 73.05199999999999
- type: ndcg_at_5
value: 75.98
- type: precision_at_1
value: 65.0
- type: precision_at_10
value: 10.5
- type: precision_at_100
value: 1.123
- type: precision_at_1000
value: 0.11299999999999999
- type: precision_at_3
value: 28.555999999999997
- type: precision_at_5
value: 19.0
- type: recall_at_1
value: 61.99400000000001
- type: recall_at_10
value: 92.72200000000001
- type: recall_at_100
value: 99.333
- type: recall_at_1000
value: 100.0
- type: recall_at_3
value: 78.739
- type: recall_at_5
value: 85.828
- task:
type: PairClassification
dataset:
name: MTEB SprintDuplicateQuestions
type: mteb/sprintduplicatequestions-pairclassification
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.79009900990098
- type: cos_sim_ap
value: 95.3203137438653
- type: cos_sim_f1
value: 89.12386706948641
- type: cos_sim_precision
value: 89.75659229208925
- type: cos_sim_recall
value: 88.5
- type: dot_accuracy
value: 99.67821782178218
- type: dot_ap
value: 89.94069840000675
- type: dot_f1
value: 83.45902463549521
- type: dot_precision
value: 83.9231547017189
- type: dot_recall
value: 83.0
- type: euclidean_accuracy
value: 99.78613861386138
- type: euclidean_ap
value: 95.10648259135526
- type: euclidean_f1
value: 88.77338877338877
- type: euclidean_precision
value: 92.42424242424242
- type: euclidean_recall
value: 85.39999999999999
- type: manhattan_accuracy
value: 99.7950495049505
- type: manhattan_ap
value: 95.29987661320946
- type: manhattan_f1
value: 89.21313183949972
- type: manhattan_precision
value: 93.14472252448314
- type: manhattan_recall
value: 85.6
- type: max_accuracy
value: 99.7950495049505
- type: max_ap
value: 95.3203137438653
- type: max_f1
value: 89.21313183949972
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClustering
type: mteb/stackexchange-clustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 67.65446577183913
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClusteringP2P
type: mteb/stackexchange-clustering-p2p
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 46.30749237193961
- task:
type: Reranking
dataset:
name: MTEB StackOverflowDupQuestions
type: mteb/stackoverflowdupquestions-reranking
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 54.91481849959949
- type: mrr
value: 55.853506175197346
- task:
type: Summarization
dataset:
name: MTEB SummEval
type: mteb/summeval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 30.08196549170419
- type: cos_sim_spearman
value: 31.16661390597077
- type: dot_pearson
value: 29.892258410943466
- type: dot_spearman
value: 30.51328811965085
- task:
type: Retrieval
dataset:
name: MTEB TRECCOVID
type: mteb/trec-covid
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 0.23900000000000002
- type: map_at_10
value: 2.173
- type: map_at_100
value: 14.24
- type: map_at_1000
value: 35.309000000000005
- type: map_at_3
value: 0.7100000000000001
- type: map_at_5
value: 1.163
- type: mrr_at_1
value: 92.0
- type: mrr_at_10
value: 96.0
- type: mrr_at_100
value: 96.0
- type: mrr_at_1000
value: 96.0
- type: mrr_at_3
value: 96.0
- type: mrr_at_5
value: 96.0
- type: ndcg_at_1
value: 90.0
- type: ndcg_at_10
value: 85.382
- type: ndcg_at_100
value: 68.03
- type: ndcg_at_1000
value: 61.021
- type: ndcg_at_3
value: 89.765
- type: ndcg_at_5
value: 88.444
- type: precision_at_1
value: 92.0
- type: precision_at_10
value: 88.0
- type: precision_at_100
value: 70.02000000000001
- type: precision_at_1000
value: 26.984
- type: precision_at_3
value: 94.0
- type: precision_at_5
value: 92.80000000000001
- type: recall_at_1
value: 0.23900000000000002
- type: recall_at_10
value: 2.313
- type: recall_at_100
value: 17.049
- type: recall_at_1000
value: 57.489999999999995
- type: recall_at_3
value: 0.737
- type: recall_at_5
value: 1.221
- task:
type: Retrieval
dataset:
name: MTEB Touche2020
type: mteb/touche2020
config: default
split: test
revision: a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f
metrics:
- type: map_at_1
value: 2.75
- type: map_at_10
value: 11.29
- type: map_at_100
value: 18.032999999999998
- type: map_at_1000
value: 19.746
- type: map_at_3
value: 6.555
- type: map_at_5
value: 8.706999999999999
- type: mrr_at_1
value: 34.694
- type: mrr_at_10
value: 50.55
- type: mrr_at_100
value: 51.659
- type: mrr_at_1000
value: 51.659
- type: mrr_at_3
value: 47.278999999999996
- type: mrr_at_5
value: 49.728
- type: ndcg_at_1
value: 32.653
- type: ndcg_at_10
value: 27.894000000000002
- type: ndcg_at_100
value: 39.769
- type: ndcg_at_1000
value: 51.495999999999995
- type: ndcg_at_3
value: 32.954
- type: ndcg_at_5
value: 31.502999999999997
- type: precision_at_1
value: 34.694
- type: precision_at_10
value: 23.265
- type: precision_at_100
value: 7.898
- type: precision_at_1000
value: 1.58
- type: precision_at_3
value: 34.694
- type: precision_at_5
value: 31.429000000000002
- type: recall_at_1
value: 2.75
- type: recall_at_10
value: 16.953
- type: recall_at_100
value: 48.68
- type: recall_at_1000
value: 85.18599999999999
- type: recall_at_3
value: 7.710999999999999
- type: recall_at_5
value: 11.484
- task:
type: Classification
dataset:
name: MTEB ToxicConversationsClassification
type: mteb/toxic_conversations_50k
config: default
split: test
revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
metrics:
- type: accuracy
value: 82.66099999999999
- type: ap
value: 25.555698090238337
- type: f1
value: 66.48402012461622
- task:
type: Classification
dataset:
name: MTEB TweetSentimentExtractionClassification
type: mteb/tweet_sentiment_extraction
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 72.94567062818335
- type: f1
value: 73.28139189595674
- task:
type: Clustering
dataset:
name: MTEB TwentyNewsgroupsClustering
type: mteb/twentynewsgroups-clustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 49.581627240203474
- task:
type: PairClassification
dataset:
name: MTEB TwitterSemEval2015
type: mteb/twittersemeval2015-pairclassification
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 87.78089050485785
- type: cos_sim_ap
value: 79.64487116574168
- type: cos_sim_f1
value: 72.46563021970964
- type: cos_sim_precision
value: 70.62359128474831
- type: cos_sim_recall
value: 74.40633245382587
- type: dot_accuracy
value: 86.2609524944865
- type: dot_ap
value: 75.513046857613
- type: dot_f1
value: 68.58213616489695
- type: dot_precision
value: 65.12455516014235
- type: dot_recall
value: 72.42744063324538
- type: euclidean_accuracy
value: 87.6080348095607
- type: euclidean_ap
value: 79.00204933649795
- type: euclidean_f1
value: 72.14495342605589
- type: euclidean_precision
value: 69.85421299728193
- type: euclidean_recall
value: 74.5910290237467
- type: manhattan_accuracy
value: 87.59611372712642
- type: manhattan_ap
value: 78.78523756706264
- type: manhattan_f1
value: 71.86499137718648
- type: manhattan_precision
value: 67.39833641404806
- type: manhattan_recall
value: 76.96569920844327
- type: max_accuracy
value: 87.78089050485785
- type: max_ap
value: 79.64487116574168
- type: max_f1
value: 72.46563021970964
- task:
type: PairClassification
dataset:
name: MTEB TwitterURLCorpus
type: mteb/twitterurlcorpus-pairclassification
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 89.98719292117825
- type: cos_sim_ap
value: 87.58146137353202
- type: cos_sim_f1
value: 80.28543232369239
- type: cos_sim_precision
value: 79.1735289714029
- type: cos_sim_recall
value: 81.42901139513397
- type: dot_accuracy
value: 88.9199363526992
- type: dot_ap
value: 84.98499998630417
- type: dot_f1
value: 78.21951400757969
- type: dot_precision
value: 75.58523624874336
- type: dot_recall
value: 81.04404065291038
- type: euclidean_accuracy
value: 89.77374160748244
- type: euclidean_ap
value: 87.35151562835209
- type: euclidean_f1
value: 79.92160922940393
- type: euclidean_precision
value: 76.88531587933979
- type: euclidean_recall
value: 83.20757622420696
- type: manhattan_accuracy
value: 89.72717041176699
- type: manhattan_ap
value: 87.34065592142515
- type: manhattan_f1
value: 79.85603419187943
- type: manhattan_precision
value: 77.82243332115455
- type: manhattan_recall
value: 81.99876809362489
- type: max_accuracy
value: 89.98719292117825
- type: max_ap
value: 87.58146137353202
- type: max_f1
value: 80.28543232369239
- task:
type: STS
dataset:
name: MTEB AFQMC
type: C-MTEB/AFQMC
config: default
split: validation
revision: b44c3b011063adb25877c13823db83bb193913c4
metrics:
- type: cos_sim_pearson
value: 53.45954203592337
- type: cos_sim_spearman
value: 58.42154680418638
- type: euclidean_pearson
value: 56.41543791722753
- type: euclidean_spearman
value: 58.39328016640146
- type: manhattan_pearson
value: 56.318510356833876
- type: manhattan_spearman
value: 58.28423447818184
- task:
type: STS
dataset:
name: MTEB ATEC
type: C-MTEB/ATEC
config: default
split: test
revision: 0f319b1142f28d00e055a6770f3f726ae9b7d865
metrics:
- type: cos_sim_pearson
value: 50.78356460675945
- type: cos_sim_spearman
value: 55.6530411663269
- type: euclidean_pearson
value: 56.50763660417816
- type: euclidean_spearman
value: 55.733823335669065
- type: manhattan_pearson
value: 56.45323093512866
- type: manhattan_spearman
value: 55.63248619032702
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (zh)
type: mteb/amazon_reviews_multi
config: zh
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 47.209999999999994
- type: f1
value: 46.08892432018655
- task:
type: STS
dataset:
name: MTEB BQ
type: C-MTEB/BQ
config: default
split: test
revision: e3dda5e115e487b39ec7e618c0c6a29137052a55
metrics:
- type: cos_sim_pearson
value: 70.25573992001478
- type: cos_sim_spearman
value: 73.85247134951433
- type: euclidean_pearson
value: 72.60033082168442
- type: euclidean_spearman
value: 73.72445893756499
- type: manhattan_pearson
value: 72.59932284620231
- type: manhattan_spearman
value: 73.68002490614583
- task:
type: Clustering
dataset:
name: MTEB CLSClusteringP2P
type: C-MTEB/CLSClusteringP2P
config: default
split: test
revision: 4b6227591c6c1a73bc76b1055f3b7f3588e72476
metrics:
- type: v_measure
value: 45.21317724305628
- task:
type: Clustering
dataset:
name: MTEB CLSClusteringS2S
type: C-MTEB/CLSClusteringS2S
config: default
split: test
revision: e458b3f5414b62b7f9f83499ac1f5497ae2e869f
metrics:
- type: v_measure
value: 42.49825170976724
- task:
type: Reranking
dataset:
name: MTEB CMedQAv1
type: C-MTEB/CMedQAv1-reranking
config: default
split: test
revision: 8d7f1e942507dac42dc58017c1a001c3717da7df
metrics:
- type: map
value: 88.15661686810597
- type: mrr
value: 90.11222222222223
- task:
type: Reranking
dataset:
name: MTEB CMedQAv2
type: C-MTEB/CMedQAv2-reranking
config: default
split: test
revision: 23d186750531a14a0357ca22cd92d712fd512ea0
metrics:
- type: map
value: 88.1204726064383
- type: mrr
value: 90.20142857142858
- task:
type: Retrieval
dataset:
name: MTEB CmedqaRetrieval
type: C-MTEB/CmedqaRetrieval
config: default
split: dev
revision: cd540c506dae1cf9e9a59c3e06f42030d54e7301
metrics:
- type: map_at_1
value: 27.224999999999998
- type: map_at_10
value: 40.169
- type: map_at_100
value: 42.0
- type: map_at_1000
value: 42.109
- type: map_at_3
value: 35.76
- type: map_at_5
value: 38.221
- type: mrr_at_1
value: 40.56
- type: mrr_at_10
value: 49.118
- type: mrr_at_100
value: 50.092999999999996
- type: mrr_at_1000
value: 50.133
- type: mrr_at_3
value: 46.507
- type: mrr_at_5
value: 47.973
- type: ndcg_at_1
value: 40.56
- type: ndcg_at_10
value: 46.972
- type: ndcg_at_100
value: 54.04
- type: ndcg_at_1000
value: 55.862
- type: ndcg_at_3
value: 41.36
- type: ndcg_at_5
value: 43.704
- type: precision_at_1
value: 40.56
- type: precision_at_10
value: 10.302999999999999
- type: precision_at_100
value: 1.606
- type: precision_at_1000
value: 0.184
- type: precision_at_3
value: 23.064
- type: precision_at_5
value: 16.764000000000003
- type: recall_at_1
value: 27.224999999999998
- type: recall_at_10
value: 58.05200000000001
- type: recall_at_100
value: 87.092
- type: recall_at_1000
value: 99.099
- type: recall_at_3
value: 41.373
- type: recall_at_5
value: 48.453
- task:
type: PairClassification
dataset:
name: MTEB Cmnli
type: C-MTEB/CMNLI
config: default
split: validation
revision: 41bc36f332156f7adc9e38f53777c959b2ae9766
metrics:
- type: cos_sim_accuracy
value: 77.40228502705953
- type: cos_sim_ap
value: 86.22359172956327
- type: cos_sim_f1
value: 78.96328293736501
- type: cos_sim_precision
value: 73.36945615091311
- type: cos_sim_recall
value: 85.48047696983868
- type: dot_accuracy
value: 75.53818400481059
- type: dot_ap
value: 83.70164011305312
- type: dot_f1
value: 77.67298719348754
- type: dot_precision
value: 67.49482401656314
- type: dot_recall
value: 91.46598082768296
- type: euclidean_accuracy
value: 77.94347564642213
- type: euclidean_ap
value: 86.4652108728609
- type: euclidean_f1
value: 79.15555555555555
- type: euclidean_precision
value: 75.41816641964853
- type: euclidean_recall
value: 83.28267477203647
- type: manhattan_accuracy
value: 77.45039085989175
- type: manhattan_ap
value: 86.09986583900665
- type: manhattan_f1
value: 78.93669264438988
- type: manhattan_precision
value: 72.63261296660117
- type: manhattan_recall
value: 86.43909282207154
- type: max_accuracy
value: 77.94347564642213
- type: max_ap
value: 86.4652108728609
- type: max_f1
value: 79.15555555555555
- task:
type: Retrieval
dataset:
name: MTEB CovidRetrieval
type: C-MTEB/CovidRetrieval
config: default
split: dev
revision: 1271c7809071a13532e05f25fb53511ffce77117
metrics:
- type: map_at_1
value: 69.336
- type: map_at_10
value: 77.16
- type: map_at_100
value: 77.47500000000001
- type: map_at_1000
value: 77.482
- type: map_at_3
value: 75.42999999999999
- type: map_at_5
value: 76.468
- type: mrr_at_1
value: 69.44200000000001
- type: mrr_at_10
value: 77.132
- type: mrr_at_100
value: 77.43299999999999
- type: mrr_at_1000
value: 77.44
- type: mrr_at_3
value: 75.395
- type: mrr_at_5
value: 76.459
- type: ndcg_at_1
value: 69.547
- type: ndcg_at_10
value: 80.794
- type: ndcg_at_100
value: 82.245
- type: ndcg_at_1000
value: 82.40899999999999
- type: ndcg_at_3
value: 77.303
- type: ndcg_at_5
value: 79.168
- type: precision_at_1
value: 69.547
- type: precision_at_10
value: 9.305
- type: precision_at_100
value: 0.9979999999999999
- type: precision_at_1000
value: 0.101
- type: precision_at_3
value: 27.749000000000002
- type: precision_at_5
value: 17.576
- type: recall_at_1
value: 69.336
- type: recall_at_10
value: 92.097
- type: recall_at_100
value: 98.736
- type: recall_at_1000
value: 100.0
- type: recall_at_3
value: 82.64
- type: recall_at_5
value: 87.144
- task:
type: Retrieval
dataset:
name: MTEB DuRetrieval
type: C-MTEB/DuRetrieval
config: default
split: dev
revision: a1a333e290fe30b10f3f56498e3a0d911a693ced
metrics:
- type: map_at_1
value: 26.817999999999998
- type: map_at_10
value: 82.67
- type: map_at_100
value: 85.304
- type: map_at_1000
value: 85.334
- type: map_at_3
value: 57.336
- type: map_at_5
value: 72.474
- type: mrr_at_1
value: 91.45
- type: mrr_at_10
value: 94.272
- type: mrr_at_100
value: 94.318
- type: mrr_at_1000
value: 94.32000000000001
- type: mrr_at_3
value: 94.0
- type: mrr_at_5
value: 94.17699999999999
- type: ndcg_at_1
value: 91.45
- type: ndcg_at_10
value: 89.404
- type: ndcg_at_100
value: 91.724
- type: ndcg_at_1000
value: 91.973
- type: ndcg_at_3
value: 88.104
- type: ndcg_at_5
value: 87.25699999999999
- type: precision_at_1
value: 91.45
- type: precision_at_10
value: 42.585
- type: precision_at_100
value: 4.838
- type: precision_at_1000
value: 0.49
- type: precision_at_3
value: 78.8
- type: precision_at_5
value: 66.66
- type: recall_at_1
value: 26.817999999999998
- type: recall_at_10
value: 90.67
- type: recall_at_100
value: 98.36200000000001
- type: recall_at_1000
value: 99.583
- type: recall_at_3
value: 59.614999999999995
- type: recall_at_5
value: 77.05199999999999
- task:
type: Retrieval
dataset:
name: MTEB EcomRetrieval
type: C-MTEB/EcomRetrieval
config: default
split: dev
revision: 687de13dc7294d6fd9be10c6945f9e8fec8166b9
metrics:
- type: map_at_1
value: 47.699999999999996
- type: map_at_10
value: 57.589999999999996
- type: map_at_100
value: 58.226
- type: map_at_1000
value: 58.251
- type: map_at_3
value: 55.233
- type: map_at_5
value: 56.633
- type: mrr_at_1
value: 47.699999999999996
- type: mrr_at_10
value: 57.589999999999996
- type: mrr_at_100
value: 58.226
- type: mrr_at_1000
value: 58.251
- type: mrr_at_3
value: 55.233
- type: mrr_at_5
value: 56.633
- type: ndcg_at_1
value: 47.699999999999996
- type: ndcg_at_10
value: 62.505
- type: ndcg_at_100
value: 65.517
- type: ndcg_at_1000
value: 66.19800000000001
- type: ndcg_at_3
value: 57.643
- type: ndcg_at_5
value: 60.181
- type: precision_at_1
value: 47.699999999999996
- type: precision_at_10
value: 7.8
- type: precision_at_100
value: 0.919
- type: precision_at_1000
value: 0.097
- type: precision_at_3
value: 21.532999999999998
- type: precision_at_5
value: 14.16
- type: recall_at_1
value: 47.699999999999996
- type: recall_at_10
value: 78.0
- type: recall_at_100
value: 91.9
- type: recall_at_1000
value: 97.3
- type: recall_at_3
value: 64.60000000000001
- type: recall_at_5
value: 70.8
- task:
type: Classification
dataset:
name: MTEB IFlyTek
type: C-MTEB/IFlyTek-classification
config: default
split: validation
revision: 421605374b29664c5fc098418fe20ada9bd55f8a
metrics:
- type: accuracy
value: 44.84801846864178
- type: f1
value: 37.47347897956339
- task:
type: Classification
dataset:
name: MTEB JDReview
type: C-MTEB/JDReview-classification
config: default
split: test
revision: b7c64bd89eb87f8ded463478346f76731f07bf8b
metrics:
- type: accuracy
value: 85.81613508442777
- type: ap
value: 52.68244615477374
- type: f1
value: 80.0445640948843
- task:
type: STS
dataset:
name: MTEB LCQMC
type: C-MTEB/LCQMC
config: default
split: test
revision: 17f9b096f80380fce5ed12a9be8be7784b337daf
metrics:
- type: cos_sim_pearson
value: 69.57786502217138
- type: cos_sim_spearman
value: 75.39106054489906
- type: euclidean_pearson
value: 73.72082954602402
- type: euclidean_spearman
value: 75.14421475913619
- type: manhattan_pearson
value: 73.62463076633642
- type: manhattan_spearman
value: 75.01301565104112
- task:
type: Reranking
dataset:
name: MTEB MMarcoReranking
type: C-MTEB/Mmarco-reranking
config: default
split: dev
revision: None
metrics:
- type: map
value: 29.143797057999134
- type: mrr
value: 28.08174603174603
- task:
type: Retrieval
dataset:
name: MTEB MMarcoRetrieval
type: C-MTEB/MMarcoRetrieval
config: default
split: dev
revision: 539bbde593d947e2a124ba72651aafc09eb33fc2
metrics:
- type: map_at_1
value: 70.492
- type: map_at_10
value: 79.501
- type: map_at_100
value: 79.728
- type: map_at_1000
value: 79.735
- type: map_at_3
value: 77.77
- type: map_at_5
value: 78.851
- type: mrr_at_1
value: 72.822
- type: mrr_at_10
value: 80.001
- type: mrr_at_100
value: 80.19
- type: mrr_at_1000
value: 80.197
- type: mrr_at_3
value: 78.484
- type: mrr_at_5
value: 79.42099999999999
- type: ndcg_at_1
value: 72.822
- type: ndcg_at_10
value: 83.013
- type: ndcg_at_100
value: 84.013
- type: ndcg_at_1000
value: 84.20400000000001
- type: ndcg_at_3
value: 79.728
- type: ndcg_at_5
value: 81.542
- type: precision_at_1
value: 72.822
- type: precision_at_10
value: 9.917
- type: precision_at_100
value: 1.042
- type: precision_at_1000
value: 0.106
- type: precision_at_3
value: 29.847
- type: precision_at_5
value: 18.871
- type: recall_at_1
value: 70.492
- type: recall_at_10
value: 93.325
- type: recall_at_100
value: 97.822
- type: recall_at_1000
value: 99.319
- type: recall_at_3
value: 84.636
- type: recall_at_5
value: 88.93100000000001
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (zh-CN)
type: mteb/amazon_massive_intent
config: zh-CN
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 76.88298587760592
- type: f1
value: 73.89001762017176
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (zh-CN)
type: mteb/amazon_massive_scenario
config: zh-CN
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 80.76328177538669
- type: f1
value: 80.24718532423358
- task:
type: Retrieval
dataset:
name: MTEB MedicalRetrieval
type: C-MTEB/MedicalRetrieval
config: default
split: dev
revision: 2039188fb5800a9803ba5048df7b76e6fb151fc6
metrics:
- type: map_at_1
value: 49.6
- type: map_at_10
value: 55.620999999999995
- type: map_at_100
value: 56.204
- type: map_at_1000
value: 56.251
- type: map_at_3
value: 54.132999999999996
- type: map_at_5
value: 54.933
- type: mrr_at_1
value: 49.7
- type: mrr_at_10
value: 55.67100000000001
- type: mrr_at_100
value: 56.254000000000005
- type: mrr_at_1000
value: 56.301
- type: mrr_at_3
value: 54.18300000000001
- type: mrr_at_5
value: 54.983000000000004
- type: ndcg_at_1
value: 49.6
- type: ndcg_at_10
value: 58.645
- type: ndcg_at_100
value: 61.789
- type: ndcg_at_1000
value: 63.219
- type: ndcg_at_3
value: 55.567
- type: ndcg_at_5
value: 57.008
- type: precision_at_1
value: 49.6
- type: precision_at_10
value: 6.819999999999999
- type: precision_at_100
value: 0.836
- type: precision_at_1000
value: 0.095
- type: precision_at_3
value: 19.900000000000002
- type: precision_at_5
value: 12.64
- type: recall_at_1
value: 49.6
- type: recall_at_10
value: 68.2
- type: recall_at_100
value: 83.6
- type: recall_at_1000
value: 95.3
- type: recall_at_3
value: 59.699999999999996
- type: recall_at_5
value: 63.2
- task:
type: Classification
dataset:
name: MTEB MultilingualSentiment
type: C-MTEB/MultilingualSentiment-classification
config: default
split: validation
revision: 46958b007a63fdbf239b7672c25d0bea67b5ea1a
metrics:
- type: accuracy
value: 74.45666666666666
- type: f1
value: 74.32582402190089
- task:
type: PairClassification
dataset:
name: MTEB Ocnli
type: C-MTEB/OCNLI
config: default
split: validation
revision: 66e76a618a34d6d565d5538088562851e6daa7ec
metrics:
- type: cos_sim_accuracy
value: 80.67135896047645
- type: cos_sim_ap
value: 87.60421240712051
- type: cos_sim_f1
value: 82.1304131408661
- type: cos_sim_precision
value: 77.68361581920904
- type: cos_sim_recall
value: 87.11721224920802
- type: dot_accuracy
value: 79.04710341093666
- type: dot_ap
value: 85.6370059719336
- type: dot_f1
value: 80.763723150358
- type: dot_precision
value: 73.69337979094077
- type: dot_recall
value: 89.33474128827878
- type: euclidean_accuracy
value: 81.05035192203573
- type: euclidean_ap
value: 87.7880240053663
- type: euclidean_f1
value: 82.50244379276637
- type: euclidean_precision
value: 76.7970882620564
- type: euclidean_recall
value: 89.1235480464625
- type: manhattan_accuracy
value: 80.61721710882512
- type: manhattan_ap
value: 87.43568120591175
- type: manhattan_f1
value: 81.89526184538653
- type: manhattan_precision
value: 77.5992438563327
- type: manhattan_recall
value: 86.6948257655755
- type: max_accuracy
value: 81.05035192203573
- type: max_ap
value: 87.7880240053663
- type: max_f1
value: 82.50244379276637
- task:
type: Classification
dataset:
name: MTEB OnlineShopping
type: C-MTEB/OnlineShopping-classification
config: default
split: test
revision: e610f2ebd179a8fda30ae534c3878750a96db120
metrics:
- type: accuracy
value: 93.5
- type: ap
value: 91.31357903446782
- type: f1
value: 93.48088994006616
- task:
type: STS
dataset:
name: MTEB PAWSX
type: C-MTEB/PAWSX
config: default
split: test
revision: 9c6a90e430ac22b5779fb019a23e820b11a8b5e1
metrics:
- type: cos_sim_pearson
value: 36.93293453538077
- type: cos_sim_spearman
value: 42.45972506308574
- type: euclidean_pearson
value: 42.34945133152159
- type: euclidean_spearman
value: 42.331610303674644
- type: manhattan_pearson
value: 42.31455070249498
- type: manhattan_spearman
value: 42.19887982891834
- task:
type: STS
dataset:
name: MTEB QBQTC
type: C-MTEB/QBQTC
config: default
split: test
revision: 790b0510dc52b1553e8c49f3d2afb48c0e5c48b7
metrics:
- type: cos_sim_pearson
value: 33.683290790043785
- type: cos_sim_spearman
value: 35.149171171202994
- type: euclidean_pearson
value: 32.33806561267862
- type: euclidean_spearman
value: 34.483576387347966
- type: manhattan_pearson
value: 32.47629754599608
- type: manhattan_spearman
value: 34.66434471867615
- task:
type: STS
dataset:
name: MTEB STS22 (zh)
type: mteb/sts22-crosslingual-sts
config: zh
split: test
revision: eea2b4fe26a775864c896887d910b76a8098ad3f
metrics:
- type: cos_sim_pearson
value: 66.46322760516104
- type: cos_sim_spearman
value: 67.398478319726
- type: euclidean_pearson
value: 64.7223480293625
- type: euclidean_spearman
value: 66.83118568812951
- type: manhattan_pearson
value: 64.88440039828305
- type: manhattan_spearman
value: 66.80429458952257
- task:
type: STS
dataset:
name: MTEB STSB
type: C-MTEB/STSB
config: default
split: test
revision: 0cde68302b3541bb8b3c340dc0644b0b745b3dc0
metrics:
- type: cos_sim_pearson
value: 79.08991383232105
- type: cos_sim_spearman
value: 79.39715677296854
- type: euclidean_pearson
value: 78.63201279320496
- type: euclidean_spearman
value: 79.40262660785731
- type: manhattan_pearson
value: 78.98138363146906
- type: manhattan_spearman
value: 79.79968413014194
- task:
type: Reranking
dataset:
name: MTEB T2Reranking
type: C-MTEB/T2Reranking
config: default
split: dev
revision: 76631901a18387f85eaa53e5450019b87ad58ef9
metrics:
- type: map
value: 67.43289278789972
- type: mrr
value: 77.53012460908535
- task:
type: Retrieval
dataset:
name: MTEB T2Retrieval
type: C-MTEB/T2Retrieval
config: default
split: dev
revision: 8731a845f1bf500a4f111cf1070785c793d10e64
metrics:
- type: map_at_1
value: 27.733999999999998
- type: map_at_10
value: 78.24799999999999
- type: map_at_100
value: 81.765
- type: map_at_1000
value: 81.824
- type: map_at_3
value: 54.92
- type: map_at_5
value: 67.61399999999999
- type: mrr_at_1
value: 90.527
- type: mrr_at_10
value: 92.843
- type: mrr_at_100
value: 92.927
- type: mrr_at_1000
value: 92.93
- type: mrr_at_3
value: 92.45100000000001
- type: mrr_at_5
value: 92.693
- type: ndcg_at_1
value: 90.527
- type: ndcg_at_10
value: 85.466
- type: ndcg_at_100
value: 88.846
- type: ndcg_at_1000
value: 89.415
- type: ndcg_at_3
value: 86.768
- type: ndcg_at_5
value: 85.46000000000001
- type: precision_at_1
value: 90.527
- type: precision_at_10
value: 42.488
- type: precision_at_100
value: 5.024
- type: precision_at_1000
value: 0.516
- type: precision_at_3
value: 75.907
- type: precision_at_5
value: 63.727000000000004
- type: recall_at_1
value: 27.733999999999998
- type: recall_at_10
value: 84.346
- type: recall_at_100
value: 95.536
- type: recall_at_1000
value: 98.42999999999999
- type: recall_at_3
value: 56.455
- type: recall_at_5
value: 70.755
- task:
type: Classification
dataset:
name: MTEB TNews
type: C-MTEB/TNews-classification
config: default
split: validation
revision: 317f262bf1e6126357bbe89e875451e4b0938fe4
metrics:
- type: accuracy
value: 49.952000000000005
- type: f1
value: 48.264617195258054
- task:
type: Clustering
dataset:
name: MTEB ThuNewsClusteringP2P
type: C-MTEB/ThuNewsClusteringP2P
config: default
split: test
revision: 5798586b105c0434e4f0fe5e767abe619442cf93
metrics:
- type: v_measure
value: 68.23769904483508
- task:
type: Clustering
dataset:
name: MTEB ThuNewsClusteringS2S
type: C-MTEB/ThuNewsClusteringS2S
config: default
split: test
revision: 8a8b2caeda43f39e13c4bc5bea0f8a667896e10d
metrics:
- type: v_measure
value: 62.50294403136556
- task:
type: Retrieval
dataset:
name: MTEB VideoRetrieval
type: C-MTEB/VideoRetrieval
config: default
split: dev
revision: 58c2597a5943a2ba48f4668c3b90d796283c5639
metrics:
- type: map_at_1
value: 54.0
- type: map_at_10
value: 63.668
- type: map_at_100
value: 64.217
- type: map_at_1000
value: 64.23100000000001
- type: map_at_3
value: 61.7
- type: map_at_5
value: 62.870000000000005
- type: mrr_at_1
value: 54.0
- type: mrr_at_10
value: 63.668
- type: mrr_at_100
value: 64.217
- type: mrr_at_1000
value: 64.23100000000001
- type: mrr_at_3
value: 61.7
- type: mrr_at_5
value: 62.870000000000005
- type: ndcg_at_1
value: 54.0
- type: ndcg_at_10
value: 68.11399999999999
- type: ndcg_at_100
value: 70.723
- type: ndcg_at_1000
value: 71.123
- type: ndcg_at_3
value: 64.074
- type: ndcg_at_5
value: 66.178
- type: precision_at_1
value: 54.0
- type: precision_at_10
value: 8.200000000000001
- type: precision_at_100
value: 0.941
- type: precision_at_1000
value: 0.097
- type: precision_at_3
value: 23.633000000000003
- type: precision_at_5
value: 15.2
- type: recall_at_1
value: 54.0
- type: recall_at_10
value: 82.0
- type: recall_at_100
value: 94.1
- type: recall_at_1000
value: 97.3
- type: recall_at_3
value: 70.89999999999999
- type: recall_at_5
value: 76.0
- task:
type: Classification
dataset:
name: MTEB Waimai
type: C-MTEB/waimai-classification
config: default
split: test
revision: 339287def212450dcaa9df8c22bf93e9980c7023
metrics:
- type: accuracy
value: 86.63000000000001
- type: ap
value: 69.99457882599567
- type: f1
value: 85.07735617998541
- task:
type: Clustering
dataset:
name: MTEB 8TagsClustering
type: PL-MTEB/8tags-clustering
config: default
split: test
revision: None
metrics:
- type: v_measure
value: 44.594104491193555
- task:
type: Classification
dataset:
name: MTEB AllegroReviews
type: PL-MTEB/allegro-reviews
config: default
split: test
revision: None
metrics:
- type: accuracy
value: 63.97614314115309
- type: f1
value: 52.15634261679283
- task:
type: Retrieval
dataset:
name: MTEB ArguAna-PL
type: clarin-knext/arguana-pl
config: default
split: test
revision: 63fc86750af76253e8c760fc9e534bbf24d260a2
metrics:
- type: map_at_1
value: 32.646
- type: map_at_10
value: 47.963
- type: map_at_100
value: 48.789
- type: map_at_1000
value: 48.797000000000004
- type: map_at_3
value: 43.196
- type: map_at_5
value: 46.016
- type: mrr_at_1
value: 33.073
- type: mrr_at_10
value: 48.126000000000005
- type: mrr_at_100
value: 48.946
- type: mrr_at_1000
value: 48.953
- type: mrr_at_3
value: 43.374
- type: mrr_at_5
value: 46.147
- type: ndcg_at_1
value: 32.646
- type: ndcg_at_10
value: 56.481
- type: ndcg_at_100
value: 59.922
- type: ndcg_at_1000
value: 60.07
- type: ndcg_at_3
value: 46.675
- type: ndcg_at_5
value: 51.76500000000001
- type: precision_at_1
value: 32.646
- type: precision_at_10
value: 8.371
- type: precision_at_100
value: 0.9860000000000001
- type: precision_at_1000
value: 0.1
- type: precision_at_3
value: 18.919
- type: precision_at_5
value: 13.825999999999999
- type: recall_at_1
value: 32.646
- type: recall_at_10
value: 83.71300000000001
- type: recall_at_100
value: 98.578
- type: recall_at_1000
value: 99.644
- type: recall_at_3
value: 56.757000000000005
- type: recall_at_5
value: 69.132
- task:
type: Classification
dataset:
name: MTEB CBD
type: PL-MTEB/cbd
config: default
split: test
revision: None
metrics:
- type: accuracy
value: 68.56
- type: ap
value: 23.310493680488513
- type: f1
value: 58.85369533105693
- task:
type: PairClassification
dataset:
name: MTEB CDSC-E
type: PL-MTEB/cdsce-pairclassification
config: default
split: test
revision: None
metrics:
- type: cos_sim_accuracy
value: 88.5
- type: cos_sim_ap
value: 72.42140924378361
- type: cos_sim_f1
value: 66.0919540229885
- type: cos_sim_precision
value: 72.78481012658227
- type: cos_sim_recall
value: 60.526315789473685
- type: dot_accuracy
value: 88.5
- type: dot_ap
value: 72.42140924378361
- type: dot_f1
value: 66.0919540229885
- type: dot_precision
value: 72.78481012658227
- type: dot_recall
value: 60.526315789473685
- type: euclidean_accuracy
value: 88.5
- type: euclidean_ap
value: 72.42140924378361
- type: euclidean_f1
value: 66.0919540229885
- type: euclidean_precision
value: 72.78481012658227
- type: euclidean_recall
value: 60.526315789473685
- type: manhattan_accuracy
value: 88.5
- type: manhattan_ap
value: 72.49745515311696
- type: manhattan_f1
value: 66.0968660968661
- type: manhattan_precision
value: 72.04968944099379
- type: manhattan_recall
value: 61.05263157894737
- type: max_accuracy
value: 88.5
- type: max_ap
value: 72.49745515311696
- type: max_f1
value: 66.0968660968661
- task:
type: STS
dataset:
name: MTEB CDSC-R
type: PL-MTEB/cdscr-sts
config: default
split: test
revision: None
metrics:
- type: cos_sim_pearson
value: 90.32269765590145
- type: cos_sim_spearman
value: 89.73666311491672
- type: euclidean_pearson
value: 88.2933868516544
- type: euclidean_spearman
value: 89.73666311491672
- type: manhattan_pearson
value: 88.33474590219448
- type: manhattan_spearman
value: 89.8548364866583
- task:
type: Retrieval
dataset:
name: MTEB DBPedia-PL
type: clarin-knext/dbpedia-pl
config: default
split: test
revision: 76afe41d9af165cc40999fcaa92312b8b012064a
metrics:
- type: map_at_1
value: 7.632999999999999
- type: map_at_10
value: 16.426
- type: map_at_100
value: 22.651
- type: map_at_1000
value: 24.372
- type: map_at_3
value: 11.706
- type: map_at_5
value: 13.529
- type: mrr_at_1
value: 60.75000000000001
- type: mrr_at_10
value: 68.613
- type: mrr_at_100
value: 69.001
- type: mrr_at_1000
value: 69.021
- type: mrr_at_3
value: 67.0
- type: mrr_at_5
value: 67.925
- type: ndcg_at_1
value: 49.875
- type: ndcg_at_10
value: 36.978
- type: ndcg_at_100
value: 40.031
- type: ndcg_at_1000
value: 47.566
- type: ndcg_at_3
value: 41.148
- type: ndcg_at_5
value: 38.702
- type: precision_at_1
value: 60.75000000000001
- type: precision_at_10
value: 29.7
- type: precision_at_100
value: 9.278
- type: precision_at_1000
value: 2.099
- type: precision_at_3
value: 44.0
- type: precision_at_5
value: 37.6
- type: recall_at_1
value: 7.632999999999999
- type: recall_at_10
value: 22.040000000000003
- type: recall_at_100
value: 44.024
- type: recall_at_1000
value: 67.848
- type: recall_at_3
value: 13.093
- type: recall_at_5
value: 15.973
- task:
type: Retrieval
dataset:
name: MTEB FiQA-PL
type: clarin-knext/fiqa-pl
config: default
split: test
revision: 2e535829717f8bf9dc829b7f911cc5bbd4e6608e
metrics:
- type: map_at_1
value: 15.473
- type: map_at_10
value: 24.579
- type: map_at_100
value: 26.387
- type: map_at_1000
value: 26.57
- type: map_at_3
value: 21.278
- type: map_at_5
value: 23.179
- type: mrr_at_1
value: 30.709999999999997
- type: mrr_at_10
value: 38.994
- type: mrr_at_100
value: 39.993
- type: mrr_at_1000
value: 40.044999999999995
- type: mrr_at_3
value: 36.342999999999996
- type: mrr_at_5
value: 37.846999999999994
- type: ndcg_at_1
value: 30.709999999999997
- type: ndcg_at_10
value: 31.608999999999998
- type: ndcg_at_100
value: 38.807
- type: ndcg_at_1000
value: 42.208
- type: ndcg_at_3
value: 28.086
- type: ndcg_at_5
value: 29.323
- type: precision_at_1
value: 30.709999999999997
- type: precision_at_10
value: 8.688
- type: precision_at_100
value: 1.608
- type: precision_at_1000
value: 0.22100000000000003
- type: precision_at_3
value: 18.724
- type: precision_at_5
value: 13.950999999999999
- type: recall_at_1
value: 15.473
- type: recall_at_10
value: 38.361000000000004
- type: recall_at_100
value: 65.2
- type: recall_at_1000
value: 85.789
- type: recall_at_3
value: 25.401
- type: recall_at_5
value: 30.875999999999998
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA-PL
type: clarin-knext/hotpotqa-pl
config: default
split: test
revision: a0bd479ac97b4ccb5bd6ce320c415d0bb4beb907
metrics:
- type: map_at_1
value: 38.096000000000004
- type: map_at_10
value: 51.44499999999999
- type: map_at_100
value: 52.325
- type: map_at_1000
value: 52.397000000000006
- type: map_at_3
value: 48.626999999999995
- type: map_at_5
value: 50.342
- type: mrr_at_1
value: 76.19200000000001
- type: mrr_at_10
value: 81.191
- type: mrr_at_100
value: 81.431
- type: mrr_at_1000
value: 81.443
- type: mrr_at_3
value: 80.30199999999999
- type: mrr_at_5
value: 80.85900000000001
- type: ndcg_at_1
value: 76.19200000000001
- type: ndcg_at_10
value: 60.9
- type: ndcg_at_100
value: 64.14699999999999
- type: ndcg_at_1000
value: 65.647
- type: ndcg_at_3
value: 56.818000000000005
- type: ndcg_at_5
value: 59.019999999999996
- type: precision_at_1
value: 76.19200000000001
- type: precision_at_10
value: 12.203
- type: precision_at_100
value: 1.478
- type: precision_at_1000
value: 0.168
- type: precision_at_3
value: 34.616
- type: precision_at_5
value: 22.515
- type: recall_at_1
value: 38.096000000000004
- type: recall_at_10
value: 61.013
- type: recall_at_100
value: 73.90299999999999
- type: recall_at_1000
value: 83.91
- type: recall_at_3
value: 51.92400000000001
- type: recall_at_5
value: 56.286
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO-PL
type: clarin-knext/msmarco-pl
config: default
split: test
revision: 8634c07806d5cce3a6138e260e59b81760a0a640
metrics:
- type: map_at_1
value: 1.548
- type: map_at_10
value: 11.049000000000001
- type: map_at_100
value: 28.874
- type: map_at_1000
value: 34.931
- type: map_at_3
value: 4.162
- type: map_at_5
value: 6.396
- type: mrr_at_1
value: 90.69800000000001
- type: mrr_at_10
value: 92.093
- type: mrr_at_100
value: 92.345
- type: mrr_at_1000
value: 92.345
- type: mrr_at_3
value: 91.86
- type: mrr_at_5
value: 91.86
- type: ndcg_at_1
value: 74.031
- type: ndcg_at_10
value: 63.978
- type: ndcg_at_100
value: 53.101
- type: ndcg_at_1000
value: 60.675999999999995
- type: ndcg_at_3
value: 71.421
- type: ndcg_at_5
value: 68.098
- type: precision_at_1
value: 90.69800000000001
- type: precision_at_10
value: 71.86
- type: precision_at_100
value: 31.395
- type: precision_at_1000
value: 5.981
- type: precision_at_3
value: 84.49600000000001
- type: precision_at_5
value: 79.07
- type: recall_at_1
value: 1.548
- type: recall_at_10
value: 12.149000000000001
- type: recall_at_100
value: 40.794999999999995
- type: recall_at_1000
value: 67.974
- type: recall_at_3
value: 4.244
- type: recall_at_5
value: 6.608
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (pl)
type: mteb/amazon_massive_intent
config: pl
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 73.55413584398119
- type: f1
value: 69.65610882318181
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (pl)
type: mteb/amazon_massive_scenario
config: pl
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 76.37188971082716
- type: f1
value: 75.64847309941361
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus-PL
type: clarin-knext/nfcorpus-pl
config: default
split: test
revision: 9a6f9567fda928260afed2de480d79c98bf0bec0
metrics:
- type: map_at_1
value: 4.919
- type: map_at_10
value: 10.834000000000001
- type: map_at_100
value: 13.38
- type: map_at_1000
value: 14.581
- type: map_at_3
value: 8.198
- type: map_at_5
value: 9.428
- type: mrr_at_1
value: 41.176
- type: mrr_at_10
value: 50.083
- type: mrr_at_100
value: 50.559
- type: mrr_at_1000
value: 50.604000000000006
- type: mrr_at_3
value: 47.936
- type: mrr_at_5
value: 49.407000000000004
- type: ndcg_at_1
value: 39.628
- type: ndcg_at_10
value: 30.098000000000003
- type: ndcg_at_100
value: 27.061
- type: ndcg_at_1000
value: 35.94
- type: ndcg_at_3
value: 35.135
- type: ndcg_at_5
value: 33.335
- type: precision_at_1
value: 41.176
- type: precision_at_10
value: 22.259999999999998
- type: precision_at_100
value: 6.712
- type: precision_at_1000
value: 1.9060000000000001
- type: precision_at_3
value: 33.23
- type: precision_at_5
value: 29.04
- type: recall_at_1
value: 4.919
- type: recall_at_10
value: 14.196
- type: recall_at_100
value: 26.948
- type: recall_at_1000
value: 59.211000000000006
- type: recall_at_3
value: 9.44
- type: recall_at_5
value: 11.569
- task:
type: Retrieval
dataset:
name: MTEB NQ-PL
type: clarin-knext/nq-pl
config: default
split: test
revision: f171245712cf85dd4700b06bef18001578d0ca8d
metrics:
- type: map_at_1
value: 25.35
- type: map_at_10
value: 37.884
- type: map_at_100
value: 38.955
- type: map_at_1000
value: 39.007999999999996
- type: map_at_3
value: 34.239999999999995
- type: map_at_5
value: 36.398
- type: mrr_at_1
value: 28.737000000000002
- type: mrr_at_10
value: 39.973
- type: mrr_at_100
value: 40.844
- type: mrr_at_1000
value: 40.885
- type: mrr_at_3
value: 36.901
- type: mrr_at_5
value: 38.721
- type: ndcg_at_1
value: 28.708
- type: ndcg_at_10
value: 44.204
- type: ndcg_at_100
value: 48.978
- type: ndcg_at_1000
value: 50.33
- type: ndcg_at_3
value: 37.36
- type: ndcg_at_5
value: 40.912
- type: precision_at_1
value: 28.708
- type: precision_at_10
value: 7.367
- type: precision_at_100
value: 1.0030000000000001
- type: precision_at_1000
value: 0.11299999999999999
- type: precision_at_3
value: 17.034
- type: precision_at_5
value: 12.293999999999999
- type: recall_at_1
value: 25.35
- type: recall_at_10
value: 61.411
- type: recall_at_100
value: 82.599
- type: recall_at_1000
value: 92.903
- type: recall_at_3
value: 43.728
- type: recall_at_5
value: 51.854
- task:
type: Classification
dataset:
name: MTEB PAC
type: laugustyniak/abusive-clauses-pl
config: default
split: test
revision: None
metrics:
- type: accuracy
value: 69.04141326382856
- type: ap
value: 77.49422763833996
- type: f1
value: 66.73472657783407
- task:
type: PairClassification
dataset:
name: MTEB PPC
type: PL-MTEB/ppc-pairclassification
config: default
split: test
revision: None
metrics:
- type: cos_sim_accuracy
value: 81.0
- type: cos_sim_ap
value: 91.47194213011349
- type: cos_sim_f1
value: 84.73767885532592
- type: cos_sim_precision
value: 81.49847094801224
- type: cos_sim_recall
value: 88.24503311258279
- type: dot_accuracy
value: 81.0
- type: dot_ap
value: 91.47194213011349
- type: dot_f1
value: 84.73767885532592
- type: dot_precision
value: 81.49847094801224
- type: dot_recall
value: 88.24503311258279
- type: euclidean_accuracy
value: 81.0
- type: euclidean_ap
value: 91.47194213011349
- type: euclidean_f1
value: 84.73767885532592
- type: euclidean_precision
value: 81.49847094801224
- type: euclidean_recall
value: 88.24503311258279
- type: manhattan_accuracy
value: 81.0
- type: manhattan_ap
value: 91.46464475050571
- type: manhattan_f1
value: 84.48687350835321
- type: manhattan_precision
value: 81.31699846860643
- type: manhattan_recall
value: 87.91390728476821
- type: max_accuracy
value: 81.0
- type: max_ap
value: 91.47194213011349
- type: max_f1
value: 84.73767885532592
- task:
type: PairClassification
dataset:
name: MTEB PSC
type: PL-MTEB/psc-pairclassification
config: default
split: test
revision: None
metrics:
- type: cos_sim_accuracy
value: 97.6808905380334
- type: cos_sim_ap
value: 99.27948611836348
- type: cos_sim_f1
value: 96.15975422427034
- type: cos_sim_precision
value: 96.90402476780186
- type: cos_sim_recall
value: 95.42682926829268
- type: dot_accuracy
value: 97.6808905380334
- type: dot_ap
value: 99.2794861183635
- type: dot_f1
value: 96.15975422427034
- type: dot_precision
value: 96.90402476780186
- type: dot_recall
value: 95.42682926829268
- type: euclidean_accuracy
value: 97.6808905380334
- type: euclidean_ap
value: 99.2794861183635
- type: euclidean_f1
value: 96.15975422427034
- type: euclidean_precision
value: 96.90402476780186
- type: euclidean_recall
value: 95.42682926829268
- type: manhattan_accuracy
value: 97.6808905380334
- type: manhattan_ap
value: 99.28715055268721
- type: manhattan_f1
value: 96.14791987673343
- type: manhattan_precision
value: 97.19626168224299
- type: manhattan_recall
value: 95.1219512195122
- type: max_accuracy
value: 97.6808905380334
- type: max_ap
value: 99.28715055268721
- type: max_f1
value: 96.15975422427034
- task:
type: Classification
dataset:
name: MTEB PolEmo2.0-IN
type: PL-MTEB/polemo2_in
config: default
split: test
revision: None
metrics:
- type: accuracy
value: 86.16343490304708
- type: f1
value: 83.3442579486744
- task:
type: Classification
dataset:
name: MTEB PolEmo2.0-OUT
type: PL-MTEB/polemo2_out
config: default
split: test
revision: None
metrics:
- type: accuracy
value: 68.40080971659918
- type: f1
value: 53.13720751142237
- task:
type: Retrieval
dataset:
name: MTEB Quora-PL
type: clarin-knext/quora-pl
config: default
split: test
revision: 0be27e93455051e531182b85e85e425aba12e9d4
metrics:
- type: map_at_1
value: 63.322
- type: map_at_10
value: 76.847
- type: map_at_100
value: 77.616
- type: map_at_1000
value: 77.644
- type: map_at_3
value: 73.624
- type: map_at_5
value: 75.603
- type: mrr_at_1
value: 72.88
- type: mrr_at_10
value: 80.376
- type: mrr_at_100
value: 80.604
- type: mrr_at_1000
value: 80.61
- type: mrr_at_3
value: 78.92
- type: mrr_at_5
value: 79.869
- type: ndcg_at_1
value: 72.89999999999999
- type: ndcg_at_10
value: 81.43
- type: ndcg_at_100
value: 83.394
- type: ndcg_at_1000
value: 83.685
- type: ndcg_at_3
value: 77.62599999999999
- type: ndcg_at_5
value: 79.656
- type: precision_at_1
value: 72.89999999999999
- type: precision_at_10
value: 12.548
- type: precision_at_100
value: 1.4869999999999999
- type: precision_at_1000
value: 0.155
- type: precision_at_3
value: 34.027
- type: precision_at_5
value: 22.654
- type: recall_at_1
value: 63.322
- type: recall_at_10
value: 90.664
- type: recall_at_100
value: 97.974
- type: recall_at_1000
value: 99.636
- type: recall_at_3
value: 80.067
- type: recall_at_5
value: 85.526
- task:
type: Retrieval
dataset:
name: MTEB SCIDOCS-PL
type: clarin-knext/scidocs-pl
config: default
split: test
revision: 45452b03f05560207ef19149545f168e596c9337
metrics:
- type: map_at_1
value: 3.95
- type: map_at_10
value: 9.658999999999999
- type: map_at_100
value: 11.384
- type: map_at_1000
value: 11.677
- type: map_at_3
value: 7.055
- type: map_at_5
value: 8.244
- type: mrr_at_1
value: 19.5
- type: mrr_at_10
value: 28.777
- type: mrr_at_100
value: 29.936
- type: mrr_at_1000
value: 30.009999999999998
- type: mrr_at_3
value: 25.55
- type: mrr_at_5
value: 27.284999999999997
- type: ndcg_at_1
value: 19.5
- type: ndcg_at_10
value: 16.589000000000002
- type: ndcg_at_100
value: 23.879
- type: ndcg_at_1000
value: 29.279
- type: ndcg_at_3
value: 15.719
- type: ndcg_at_5
value: 13.572000000000001
- type: precision_at_1
value: 19.5
- type: precision_at_10
value: 8.62
- type: precision_at_100
value: 1.924
- type: precision_at_1000
value: 0.322
- type: precision_at_3
value: 14.6
- type: precision_at_5
value: 11.78
- type: recall_at_1
value: 3.95
- type: recall_at_10
value: 17.477999999999998
- type: recall_at_100
value: 38.99
- type: recall_at_1000
value: 65.417
- type: recall_at_3
value: 8.883000000000001
- type: recall_at_5
value: 11.933
- task:
type: PairClassification
dataset:
name: MTEB SICK-E-PL
type: PL-MTEB/sicke-pl-pairclassification
config: default
split: test
revision: None
metrics:
- type: cos_sim_accuracy
value: 83.48960456583775
- type: cos_sim_ap
value: 76.31522115825375
- type: cos_sim_f1
value: 70.35573122529645
- type: cos_sim_precision
value: 70.9934735315446
- type: cos_sim_recall
value: 69.72934472934473
- type: dot_accuracy
value: 83.48960456583775
- type: dot_ap
value: 76.31522115825373
- type: dot_f1
value: 70.35573122529645
- type: dot_precision
value: 70.9934735315446
- type: dot_recall
value: 69.72934472934473
- type: euclidean_accuracy
value: 83.48960456583775
- type: euclidean_ap
value: 76.31522115825373
- type: euclidean_f1
value: 70.35573122529645
- type: euclidean_precision
value: 70.9934735315446
- type: euclidean_recall
value: 69.72934472934473
- type: manhattan_accuracy
value: 83.46922136159804
- type: manhattan_ap
value: 76.18474601388084
- type: manhattan_f1
value: 70.34779490856937
- type: manhattan_precision
value: 70.83032490974729
- type: manhattan_recall
value: 69.87179487179486
- type: max_accuracy
value: 83.48960456583775
- type: max_ap
value: 76.31522115825375
- type: max_f1
value: 70.35573122529645
- task:
type: STS
dataset:
name: MTEB SICK-R-PL
type: PL-MTEB/sickr-pl-sts
config: default
split: test
revision: None
metrics:
- type: cos_sim_pearson
value: 77.95374883876302
- type: cos_sim_spearman
value: 73.77630219171942
- type: euclidean_pearson
value: 75.81927069594934
- type: euclidean_spearman
value: 73.7763211303831
- type: manhattan_pearson
value: 76.03126859057528
- type: manhattan_spearman
value: 73.96528138013369
- task:
type: STS
dataset:
name: MTEB STS22 (pl)
type: mteb/sts22-crosslingual-sts
config: pl
split: test
revision: eea2b4fe26a775864c896887d910b76a8098ad3f
metrics:
- type: cos_sim_pearson
value: 37.388282764841826
- type: cos_sim_spearman
value: 40.83477184710897
- type: euclidean_pearson
value: 26.754737044177805
- type: euclidean_spearman
value: 40.83477184710897
- type: manhattan_pearson
value: 26.760453110872458
- type: manhattan_spearman
value: 41.034477441383856
- task:
type: Retrieval
dataset:
name: MTEB SciFact-PL
type: clarin-knext/scifact-pl
config: default
split: test
revision: 47932a35f045ef8ed01ba82bf9ff67f6e109207e
metrics:
- type: map_at_1
value: 49.15
- type: map_at_10
value: 61.690999999999995
- type: map_at_100
value: 62.348000000000006
- type: map_at_1000
value: 62.38
- type: map_at_3
value: 58.824
- type: map_at_5
value: 60.662000000000006
- type: mrr_at_1
value: 51.333
- type: mrr_at_10
value: 62.731
- type: mrr_at_100
value: 63.245
- type: mrr_at_1000
value: 63.275000000000006
- type: mrr_at_3
value: 60.667
- type: mrr_at_5
value: 61.93300000000001
- type: ndcg_at_1
value: 51.333
- type: ndcg_at_10
value: 67.168
- type: ndcg_at_100
value: 69.833
- type: ndcg_at_1000
value: 70.56700000000001
- type: ndcg_at_3
value: 62.40599999999999
- type: ndcg_at_5
value: 65.029
- type: precision_at_1
value: 51.333
- type: precision_at_10
value: 9.333
- type: precision_at_100
value: 1.0699999999999998
- type: precision_at_1000
value: 0.11299999999999999
- type: precision_at_3
value: 25.333
- type: precision_at_5
value: 17.067
- type: recall_at_1
value: 49.15
- type: recall_at_10
value: 82.533
- type: recall_at_100
value: 94.167
- type: recall_at_1000
value: 99.667
- type: recall_at_3
value: 69.917
- type: recall_at_5
value: 76.356
- task:
type: Retrieval
dataset:
name: MTEB TRECCOVID-PL
type: clarin-knext/trec-covid-pl
config: default
split: test
revision: 81bcb408f33366c2a20ac54adafad1ae7e877fdd
metrics:
- type: map_at_1
value: 0.261
- type: map_at_10
value: 2.1260000000000003
- type: map_at_100
value: 12.171999999999999
- type: map_at_1000
value: 26.884999999999998
- type: map_at_3
value: 0.695
- type: map_at_5
value: 1.134
- type: mrr_at_1
value: 96.0
- type: mrr_at_10
value: 96.952
- type: mrr_at_100
value: 96.952
- type: mrr_at_1000
value: 96.952
- type: mrr_at_3
value: 96.667
- type: mrr_at_5
value: 96.667
- type: ndcg_at_1
value: 92.0
- type: ndcg_at_10
value: 81.193
- type: ndcg_at_100
value: 61.129
- type: ndcg_at_1000
value: 51.157
- type: ndcg_at_3
value: 85.693
- type: ndcg_at_5
value: 84.129
- type: precision_at_1
value: 96.0
- type: precision_at_10
value: 85.39999999999999
- type: precision_at_100
value: 62.03999999999999
- type: precision_at_1000
value: 22.224
- type: precision_at_3
value: 88.0
- type: precision_at_5
value: 88.0
- type: recall_at_1
value: 0.261
- type: recall_at_10
value: 2.262
- type: recall_at_100
value: 14.981
- type: recall_at_1000
value: 46.837
- type: recall_at_3
value: 0.703
- type: recall_at_5
value: 1.172
- task:
type: Clustering
dataset:
name: MTEB AlloProfClusteringP2P
type: lyon-nlp/alloprof
config: default
split: test
revision: 392ba3f5bcc8c51f578786c1fc3dae648662cb9b
metrics:
- type: v_measure
value: 70.55290063940157
- type: v_measure
value: 55.41500719337263
- task:
type: Reranking
dataset:
name: MTEB AlloprofReranking
type: lyon-nlp/mteb-fr-reranking-alloprof-s2p
config: default
split: test
revision: 666fdacebe0291776e86f29345663dfaf80a0db9
metrics:
- type: map
value: 73.48697375332002
- type: mrr
value: 75.01836585523822
- task:
type: Retrieval
dataset:
name: MTEB AlloprofRetrieval
type: lyon-nlp/alloprof
config: default
split: test
revision: 392ba3f5bcc8c51f578786c1fc3dae648662cb9b
metrics:
- type: map_at_1
value: 38.454
- type: map_at_10
value: 51.605000000000004
- type: map_at_100
value: 52.653000000000006
- type: map_at_1000
value: 52.697
- type: map_at_3
value: 48.304
- type: map_at_5
value: 50.073
- type: mrr_at_1
value: 43.307
- type: mrr_at_10
value: 54.400000000000006
- type: mrr_at_100
value: 55.147999999999996
- type: mrr_at_1000
value: 55.174
- type: mrr_at_3
value: 51.77
- type: mrr_at_5
value: 53.166999999999994
- type: ndcg_at_1
value: 43.307
- type: ndcg_at_10
value: 57.891000000000005
- type: ndcg_at_100
value: 62.161
- type: ndcg_at_1000
value: 63.083
- type: ndcg_at_3
value: 51.851
- type: ndcg_at_5
value: 54.605000000000004
- type: precision_at_1
value: 43.307
- type: precision_at_10
value: 9.033
- type: precision_at_100
value: 1.172
- type: precision_at_1000
value: 0.127
- type: precision_at_3
value: 22.798
- type: precision_at_5
value: 15.492
- type: recall_at_1
value: 38.454
- type: recall_at_10
value: 74.166
- type: recall_at_100
value: 92.43599999999999
- type: recall_at_1000
value: 99.071
- type: recall_at_3
value: 58.087
- type: recall_at_5
value: 64.568
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (fr)
type: mteb/amazon_reviews_multi
config: fr
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 53.474
- type: f1
value: 50.38275392350236
- task:
type: Retrieval
dataset:
name: MTEB BSARDRetrieval
type: maastrichtlawtech/bsard
config: default
split: test
revision: 5effa1b9b5fa3b0f9e12523e6e43e5f86a6e6d59
metrics:
- type: map_at_1
value: 2.252
- type: map_at_10
value: 4.661
- type: map_at_100
value: 5.271
- type: map_at_1000
value: 5.3629999999999995
- type: map_at_3
value: 3.604
- type: map_at_5
value: 4.3020000000000005
- type: mrr_at_1
value: 2.252
- type: mrr_at_10
value: 4.661
- type: mrr_at_100
value: 5.271
- type: mrr_at_1000
value: 5.3629999999999995
- type: mrr_at_3
value: 3.604
- type: mrr_at_5
value: 4.3020000000000005
- type: ndcg_at_1
value: 2.252
- type: ndcg_at_10
value: 6.3020000000000005
- type: ndcg_at_100
value: 10.342
- type: ndcg_at_1000
value: 13.475999999999999
- type: ndcg_at_3
value: 4.0649999999999995
- type: ndcg_at_5
value: 5.344
- type: precision_at_1
value: 2.252
- type: precision_at_10
value: 1.171
- type: precision_at_100
value: 0.333
- type: precision_at_1000
value: 0.059000000000000004
- type: precision_at_3
value: 1.802
- type: precision_at_5
value: 1.712
- type: recall_at_1
value: 2.252
- type: recall_at_10
value: 11.712
- type: recall_at_100
value: 33.333
- type: recall_at_1000
value: 59.458999999999996
- type: recall_at_3
value: 5.405
- type: recall_at_5
value: 8.559
- task:
type: Clustering
dataset:
name: MTEB HALClusteringS2S
type: lyon-nlp/clustering-hal-s2s
config: default
split: test
revision: e06ebbbb123f8144bef1a5d18796f3dec9ae2915
metrics:
- type: v_measure
value: 28.301882091023288
- task:
type: Clustering
dataset:
name: MTEB MLSUMClusteringP2P
type: mlsum
config: default
split: test
revision: b5d54f8f3b61ae17845046286940f03c6bc79bc7
metrics:
- type: v_measure
value: 45.26992995191701
- type: v_measure
value: 42.773174876871145
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (fr)
type: mteb/mtop_domain
config: fr
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 93.47635452552458
- type: f1
value: 93.19922617577213
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (fr)
type: mteb/mtop_intent
config: fr
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 80.2317569683683
- type: f1
value: 56.18060418621901
- task:
type: Classification
dataset:
name: MTEB MasakhaNEWSClassification (fra)
type: masakhane/masakhanews
config: fra
split: test
revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60
metrics:
- type: accuracy
value: 85.18957345971565
- type: f1
value: 80.829981537394
- task:
type: Clustering
dataset:
name: MTEB MasakhaNEWSClusteringP2P (fra)
type: masakhane/masakhanews
config: fra
split: test
revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60
metrics:
- type: v_measure
value: 71.04138999801822
- type: v_measure
value: 71.7056263158008
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (fr)
type: mteb/amazon_massive_intent
config: fr
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 76.65097511768661
- type: f1
value: 73.82441070598712
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (fr)
type: mteb/amazon_massive_scenario
config: fr
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 79.09885675857431
- type: f1
value: 78.28407777434224
- task:
type: Retrieval
dataset:
name: MTEB MintakaRetrieval (fr)
type: jinaai/mintakaqa
config: fr
split: test
revision: efa78cc2f74bbcd21eff2261f9e13aebe40b814e
metrics:
- type: map_at_1
value: 25.307000000000002
- type: map_at_10
value: 36.723
- type: map_at_100
value: 37.713
- type: map_at_1000
value: 37.769000000000005
- type: map_at_3
value: 33.77
- type: map_at_5
value: 35.463
- type: mrr_at_1
value: 25.307000000000002
- type: mrr_at_10
value: 36.723
- type: mrr_at_100
value: 37.713
- type: mrr_at_1000
value: 37.769000000000005
- type: mrr_at_3
value: 33.77
- type: mrr_at_5
value: 35.463
- type: ndcg_at_1
value: 25.307000000000002
- type: ndcg_at_10
value: 42.559999999999995
- type: ndcg_at_100
value: 47.457
- type: ndcg_at_1000
value: 49.162
- type: ndcg_at_3
value: 36.461
- type: ndcg_at_5
value: 39.504
- type: precision_at_1
value: 25.307000000000002
- type: precision_at_10
value: 6.106
- type: precision_at_100
value: 0.8420000000000001
- type: precision_at_1000
value: 0.098
- type: precision_at_3
value: 14.741999999999999
- type: precision_at_5
value: 10.319
- type: recall_at_1
value: 25.307000000000002
- type: recall_at_10
value: 61.056999999999995
- type: recall_at_100
value: 84.152
- type: recall_at_1000
value: 98.03399999999999
- type: recall_at_3
value: 44.226
- type: recall_at_5
value: 51.597
- task:
type: PairClassification
dataset:
name: MTEB OpusparcusPC (fr)
type: GEM/opusparcus
config: fr
split: test
revision: 9e9b1f8ef51616073f47f306f7f47dd91663f86a
metrics:
- type: cos_sim_accuracy
value: 99.90069513406156
- type: cos_sim_ap
value: 100.0
- type: cos_sim_f1
value: 99.95032290114257
- type: cos_sim_precision
value: 100.0
- type: cos_sim_recall
value: 99.90069513406156
- type: dot_accuracy
value: 99.90069513406156
- type: dot_ap
value: 100.0
- type: dot_f1
value: 99.95032290114257
- type: dot_precision
value: 100.0
- type: dot_recall
value: 99.90069513406156
- type: euclidean_accuracy
value: 99.90069513406156
- type: euclidean_ap
value: 100.0
- type: euclidean_f1
value: 99.95032290114257
- type: euclidean_precision
value: 100.0
- type: euclidean_recall
value: 99.90069513406156
- type: manhattan_accuracy
value: 99.90069513406156
- type: manhattan_ap
value: 100.0
- type: manhattan_f1
value: 99.95032290114257
- type: manhattan_precision
value: 100.0
- type: manhattan_recall
value: 99.90069513406156
- type: max_accuracy
value: 99.90069513406156
- type: max_ap
value: 100.0
- type: max_f1
value: 99.95032290114257
- task:
type: PairClassification
dataset:
name: MTEB PawsX (fr)
type: paws-x
config: fr
split: test
revision: 8a04d940a42cd40658986fdd8e3da561533a3646
metrics:
- type: cos_sim_accuracy
value: 70.8
- type: cos_sim_ap
value: 73.7671529695957
- type: cos_sim_f1
value: 68.80964339527875
- type: cos_sim_precision
value: 62.95955882352941
- type: cos_sim_recall
value: 75.85825027685493
- type: dot_accuracy
value: 70.8
- type: dot_ap
value: 73.78345265366947
- type: dot_f1
value: 68.80964339527875
- type: dot_precision
value: 62.95955882352941
- type: dot_recall
value: 75.85825027685493
- type: euclidean_accuracy
value: 70.8
- type: euclidean_ap
value: 73.7671529695957
- type: euclidean_f1
value: 68.80964339527875
- type: euclidean_precision
value: 62.95955882352941
- type: euclidean_recall
value: 75.85825027685493
- type: manhattan_accuracy
value: 70.75
- type: manhattan_ap
value: 73.78996383615953
- type: manhattan_f1
value: 68.79432624113475
- type: manhattan_precision
value: 63.39869281045751
- type: manhattan_recall
value: 75.1937984496124
- type: max_accuracy
value: 70.8
- type: max_ap
value: 73.78996383615953
- type: max_f1
value: 68.80964339527875
- task:
type: STS
dataset:
name: MTEB SICKFr
type: Lajavaness/SICK-fr
config: default
split: test
revision: e077ab4cf4774a1e36d86d593b150422fafd8e8a
metrics:
- type: cos_sim_pearson
value: 84.03253762760392
- type: cos_sim_spearman
value: 79.68280105762004
- type: euclidean_pearson
value: 80.98265050044444
- type: euclidean_spearman
value: 79.68233242682867
- type: manhattan_pearson
value: 80.9678911810704
- type: manhattan_spearman
value: 79.70264097683109
- task:
type: STS
dataset:
name: MTEB STS22 (fr)
type: mteb/sts22-crosslingual-sts
config: fr
split: test
revision: eea2b4fe26a775864c896887d910b76a8098ad3f
metrics:
- type: cos_sim_pearson
value: 80.56896987572884
- type: cos_sim_spearman
value: 81.84352499523287
- type: euclidean_pearson
value: 80.40831759421305
- type: euclidean_spearman
value: 81.84352499523287
- type: manhattan_pearson
value: 80.74333857561238
- type: manhattan_spearman
value: 82.41503246733892
- task:
type: STS
dataset:
name: MTEB STSBenchmarkMultilingualSTS (fr)
type: stsb_multi_mt
config: fr
split: test
revision: 93d57ef91790589e3ce9c365164337a8a78b7632
metrics:
- type: cos_sim_pearson
value: 82.71826762276979
- type: cos_sim_spearman
value: 82.25433354916042
- type: euclidean_pearson
value: 81.87115571724316
- type: euclidean_spearman
value: 82.25322342890107
- type: manhattan_pearson
value: 82.11174867527224
- type: manhattan_spearman
value: 82.55905365203084
- task:
type: Summarization
dataset:
name: MTEB SummEvalFr
type: lyon-nlp/summarization-summeval-fr-p2p
config: default
split: test
revision: b385812de6a9577b6f4d0f88c6a6e35395a94054
metrics:
- type: cos_sim_pearson
value: 30.659441623392887
- type: cos_sim_spearman
value: 30.501134097353315
- type: dot_pearson
value: 30.659444768851056
- type: dot_spearman
value: 30.501134097353315
- task:
type: Reranking
dataset:
name: MTEB SyntecReranking
type: lyon-nlp/mteb-fr-reranking-syntec-s2p
config: default
split: test
revision: b205c5084a0934ce8af14338bf03feb19499c84d
metrics:
- type: map
value: 94.03333333333333
- type: mrr
value: 94.03333333333333
- task:
type: Retrieval
dataset:
name: MTEB SyntecRetrieval
type: lyon-nlp/mteb-fr-retrieval-syntec-s2p
config: default
split: test
revision: 77f7e271bf4a92b24fce5119f3486b583ca016ff
metrics:
- type: map_at_1
value: 79.0
- type: map_at_10
value: 87.61
- type: map_at_100
value: 87.655
- type: map_at_1000
value: 87.655
- type: map_at_3
value: 87.167
- type: map_at_5
value: 87.36699999999999
- type: mrr_at_1
value: 79.0
- type: mrr_at_10
value: 87.61
- type: mrr_at_100
value: 87.655
- type: mrr_at_1000
value: 87.655
- type: mrr_at_3
value: 87.167
- type: mrr_at_5
value: 87.36699999999999
- type: ndcg_at_1
value: 79.0
- type: ndcg_at_10
value: 90.473
- type: ndcg_at_100
value: 90.694
- type: ndcg_at_1000
value: 90.694
- type: ndcg_at_3
value: 89.464
- type: ndcg_at_5
value: 89.851
- type: precision_at_1
value: 79.0
- type: precision_at_10
value: 9.9
- type: precision_at_100
value: 1.0
- type: precision_at_1000
value: 0.1
- type: precision_at_3
value: 32.0
- type: precision_at_5
value: 19.400000000000002
- type: recall_at_1
value: 79.0
- type: recall_at_10
value: 99.0
- type: recall_at_100
value: 100.0
- type: recall_at_1000
value: 100.0
- type: recall_at_3
value: 96.0
- type: recall_at_5
value: 97.0
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (fr)
type: jinaai/xpqa
config: fr
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: map_at_1
value: 39.395
- type: map_at_10
value: 59.123999999999995
- type: map_at_100
value: 60.704
- type: map_at_1000
value: 60.760000000000005
- type: map_at_3
value: 53.187
- type: map_at_5
value: 56.863
- type: mrr_at_1
value: 62.083
- type: mrr_at_10
value: 68.87299999999999
- type: mrr_at_100
value: 69.46900000000001
- type: mrr_at_1000
value: 69.48299999999999
- type: mrr_at_3
value: 66.8
- type: mrr_at_5
value: 67.928
- type: ndcg_at_1
value: 62.083
- type: ndcg_at_10
value: 65.583
- type: ndcg_at_100
value: 70.918
- type: ndcg_at_1000
value: 71.72800000000001
- type: ndcg_at_3
value: 60.428000000000004
- type: ndcg_at_5
value: 61.853
- type: precision_at_1
value: 62.083
- type: precision_at_10
value: 15.033
- type: precision_at_100
value: 1.9529999999999998
- type: precision_at_1000
value: 0.207
- type: precision_at_3
value: 36.315
- type: precision_at_5
value: 25.955000000000002
- type: recall_at_1
value: 39.395
- type: recall_at_10
value: 74.332
- type: recall_at_100
value: 94.729
- type: recall_at_1000
value: 99.75500000000001
- type: recall_at_3
value: 57.679
- type: recall_at_5
value: 65.036
---
## gte-Qwen2-1.5B-instruct
**gte-Qwen2-1.5B-instruct** is the latest model in the gte (General Text Embedding) model family. The model is built on the [Qwen2-1.5B](https://huggingface.co/Qwen/Qwen2-1.5B) LLM and uses the same training data and training strategies as the [gte-Qwen2-7B-instruct](https://huggingface.co/Alibaba-NLP/gte-Qwen2-7B-instruct) model.
The model incorporates several key advancements:
- Integration of bidirectional attention mechanisms, enriching its contextual understanding (contrasted with standard causal masking in the sketch after this list).
- Instruction tuning, applied solely on the query side for streamlined efficiency.
- Comprehensive training across a vast, multilingual text corpus spanning diverse domains and scenarios. This training leverages both weakly supervised and supervised data, ensuring the model's applicability across numerous languages and a wide array of downstream tasks.
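A minimal sketch of the attention difference (illustrative tensors only; the actual implementation lives inside the model's attention layers):

```python
import torch

# Illustrative only: causal masking lets token i attend to tokens 0..i,
# while bidirectional attention lets every token attend to every token.
seq_len = 4
causal_mask = torch.tril(torch.ones(seq_len, seq_len))   # standard decoder-only LMs
bidirectional_mask = torch.ones(seq_len, seq_len)        # this embedding model
print(causal_mask)
print(bidirectional_mask)
```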
## Model Information
- Model Size: 1.5B
- Embedding Dimension: 1536
- Max Input Tokens: 32k
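As a quick sanity check, the advertised embedding dimension can be verified directly (a minimal sketch; this downloads the model):

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("Alibaba-NLP/gte-Qwen2-1.5B-instruct", trust_remote_code=True)
emb = model.encode(["hello world"])
print(emb.shape)  # expected: (1, 1536)
```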
## Requirements
```
transformers>=4.39.2
flash_attn>=2.5.6
```
## Usage
### Sentence Transformers
```python
from sentence_transformers import SentenceTransformer
model = SentenceTransformer("Alibaba-NLP/gte-Qwen2-1.5B-instruct", trust_remote_code=True)
# In case you want to reduce the maximum length:
model.max_seq_length = 8192
queries = [
"how much protein should a female eat",
"summit define",
]
documents = [
"As a general guideline, the CDC's average requirement of protein for women ages 19 to 70 is 46 grams per day. But, as you can see from this chart, you'll need to increase that if you're expecting or training for a marathon. Check out the chart below to see how much protein you should be eating each day.",
"Definition of summit for English Language Learners. : 1 the highest point of a mountain : the top of a mountain. : 2 the highest level. : 3 a meeting or series of meetings between the leaders of two or more governments.",
]
query_embeddings = model.encode(queries, prompt_name="query")
document_embeddings = model.encode(documents)
scores = (query_embeddings @ document_embeddings.T) * 100
print(scores.tolist())
```
See [config_sentence_transformers.json](config_sentence_transformers.json) for all pre-built prompt names. Alternatively, you can pass a custom prompt of your choice with `model.encode(queries, prompt="Instruct: ...\nQuery: ")`.
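For example, here is a small sketch with a custom instruction (the task wording below is illustrative, not a prescribed prompt); it reuses `model` and `queries` from the snippet above:
```python
# Hypothetical one-sentence task description; adapt it to your retrieval scenario.
custom_prompt = (
    "Instruct: Given a question about nutrition, retrieve passages that answer it\n"
    "Query: "
)
query_embeddings = model.encode(queries, prompt=custom_prompt)
```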
### Transformers
```python
import torch
import torch.nn.functional as F
from torch import Tensor
from transformers import AutoTokenizer, AutoModel
def last_token_pool(last_hidden_states: Tensor,
                    attention_mask: Tensor) -> Tensor:
    # With left padding, the last position of every sequence holds a real token,
    # so its hidden state can be taken directly.
    left_padding = (attention_mask[:, -1].sum() == attention_mask.shape[0])
    if left_padding:
        return last_hidden_states[:, -1]
    else:
        # With right padding, locate each sequence's last non-padding token instead.
        sequence_lengths = attention_mask.sum(dim=1) - 1
        batch_size = last_hidden_states.shape[0]
        return last_hidden_states[torch.arange(batch_size, device=last_hidden_states.device), sequence_lengths]

def get_detailed_instruct(task_description: str, query: str) -> str:
    return f'Instruct: {task_description}\nQuery: {query}'
# Each query must come with a one-sentence instruction that describes the task
task = 'Given a web search query, retrieve relevant passages that answer the query'
queries = [
get_detailed_instruct(task, 'how much protein should a female eat'),
get_detailed_instruct(task, 'summit define')
]
# No need to add instruction for retrieval documents
documents = [
"As a general guideline, the CDC's average requirement of protein for women ages 19 to 70 is 46 grams per day. But, as you can see from this chart, you'll need to increase that if you're expecting or training for a marathon. Check out the chart below to see how much protein you should be eating each day.",
"Definition of summit for English Language Learners. : 1 the highest point of a mountain : the top of a mountain. : 2 the highest level. : 3 a meeting or series of meetings between the leaders of two or more governments."
]
input_texts = queries + documents
tokenizer = AutoTokenizer.from_pretrained('Alibaba-NLP/gte-Qwen2-1.5B-instruct', trust_remote_code=True)
model = AutoModel.from_pretrained('Alibaba-NLP/gte-Qwen2-1.5B-instruct', trust_remote_code=True)
max_length = 8192
# Tokenize the input texts
batch_dict = tokenizer(input_texts, max_length=max_length, padding=True, truncation=True, return_tensors='pt')
outputs = model(**batch_dict)
embeddings = last_token_pool(outputs.last_hidden_state, batch_dict['attention_mask'])
# normalize embeddings
embeddings = F.normalize(embeddings, p=2, dim=1)
scores = (embeddings[:2] @ embeddings[2:].T) * 100
print(scores.tolist())
```
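Since the requirements above pin `flash_attn`, you may also want to load the model in half precision with FlashAttention 2 enabled. A hedged sketch, assuming a CUDA GPU and a recent `transformers` release that supports the `attn_implementation` argument:
```python
import torch
from transformers import AutoModel

# Assumption: a CUDA device and flash_attn>=2.5.6 are available;
# otherwise drop attn_implementation to fall back to the default attention.
model = AutoModel.from_pretrained(
    'Alibaba-NLP/gte-Qwen2-1.5B-instruct',
    trust_remote_code=True,
    torch_dtype=torch.float16,
    attn_implementation='flash_attention_2',
).to('cuda')
```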
## Evaluation
### MTEB & C-MTEB
You can use [scripts/eval_mteb.py](https://huggingface.co/Alibaba-NLP/gte-Qwen2-7B-instruct/blob/main/scripts/eval_mteb.py) to reproduce the following results of **gte-Qwen2-1.5B-instruct** on MTEB (English) and C-MTEB (Chinese):
| Model Name | MTEB(56) | C-MTEB(35) | MTEB-fr(26) | MTEB-pl(26) |
|:----:|:---------:|:----------:|:----------:|:----------:|
| [bge-base-en-1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) | 64.23 | - | - | - |
| [bge-large-en-1.5](https://huggingface.co/BAAI/bge-large-en-v1.5) | 63.55 | - | - | - |
| [gte-large-en-v1.5](https://huggingface.co/Alibaba-NLP/gte-large-en-v1.5) | 65.39 | - | - | - |
| [gte-base-en-v1.5](https://huggingface.co/Alibaba-NLP/gte-base-en-v1.5) | 64.11 | - | - | - |
| [mxbai-embed-large-v1](https://huggingface.co/mixedbread-ai/mxbai-embed-large-v1) | 64.68 | - | - | - |
| [acge_text_embedding](https://huggingface.co/aspire/acge_text_embedding) | - | 69.07 | - | - |
| [stella-mrl-large-zh-v3.5-1792d](https://huggingface.co/infgrad/stella-mrl-large-zh-v3.5-1792d) | - | 68.55 | - | - |
| [gte-large-zh](https://huggingface.co/thenlper/gte-large-zh) | - | 66.72 | - | - |
| [multilingual-e5-base](https://huggingface.co/intfloat/multilingual-e5-base) | 59.45 | 56.21 | - | - |
| [multilingual-e5-large](https://huggingface.co/intfloat/multilingual-e5-large) | 61.50 | 58.81 | - | - |
| [e5-mistral-7b-instruct](https://huggingface.co/intfloat/e5-mistral-7b-instruct) | 66.63 | 60.81 | - | - |
| [gte-Qwen1.5-7B-instruct](https://huggingface.co/Alibaba-NLP/gte-Qwen1.5-7B-instruct) | 67.34 | 69.52 | - | - |
| [NV-Embed-v1](https://huggingface.co/nvidia/NV-Embed-v1) | 69.32 | - | - | - |
| [**gte-Qwen2-7B-instruct**](https://huggingface.co/Alibaba-NLP/gte-Qwen2-7B-instruct) | **70.24** | **72.05** | **68.25** | **67.86** |
| [**gte-Qwen2-1.5B-instruct**](https://huggingface.co/Alibaba-NLP/gte-Qwen2-1.5B-instruct) | **67.16** | **67.65** | **66.60** | **64.04** |
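If you prefer not to use the linked script, below is a minimal sketch of running a single MTEB task with the open-source `mteb` package; the task choice is illustrative, and the exact API may differ across `mteb` versions:
```python
import mteb
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("Alibaba-NLP/gte-Qwen2-1.5B-instruct", trust_remote_code=True)

# Illustrative single-task run; the full benchmark covers many more tasks.
tasks = mteb.get_tasks(tasks=["Banking77Classification"])
evaluation = mteb.MTEB(tasks=tasks)
results = evaluation.run(model, output_folder="results/gte-Qwen2-1.5B-instruct")
```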
### GTE Models
The gte series has consistently released two types of models: encoder-only models (based on the BERT architecture) and decoder-only models (based on the LLM architecture).
| Models | Language | Max Sequence Length | Dimension | Model Size (Memory Usage, fp32) |
|:-------------------------------------------------------------------------------------:|:------------:|:-------------------:|:---------:|:-------------------------------:|
| [GTE-large-zh](https://huggingface.co/thenlper/gte-large-zh) | Chinese | 512 | 1024 | 1.25GB |
| [GTE-base-zh](https://huggingface.co/thenlper/gte-base-zh) | Chinese | 512 | 512 | 0.41GB |
| [GTE-small-zh](https://huggingface.co/thenlper/gte-small-zh) | Chinese | 512 | 512 | 0.12GB |
| [GTE-large](https://huggingface.co/thenlper/gte-large) | English | 512 | 1024 | 1.25GB |
| [GTE-base](https://huggingface.co/thenlper/gte-base) | English | 512 | 512 | 0.21GB |
| [GTE-small](https://huggingface.co/thenlper/gte-small) | English | 512 | 384 | 0.10GB |
| [GTE-large-en-v1.5](https://huggingface.co/Alibaba-NLP/gte-large-en-v1.5) | English | 8192 | 1024 | 1.74GB |
| [GTE-base-en-v1.5](https://huggingface.co/Alibaba-NLP/gte-base-en-v1.5) | English | 8192 | 768 | 0.51GB |
| [GTE-Qwen1.5-7B-instruct](https://huggingface.co/Alibaba-NLP/gte-Qwen1.5-7B-instruct) | Multilingual | 32000 | 4096 | 26.45GB |
| [GTE-Qwen2-7B-instruct](https://huggingface.co/Alibaba-NLP/gte-Qwen2-7B-instruct) | Multilingual | 32000 | 3584 | 26.45GB |
| [GTE-Qwen2-1.5B-instruct](https://huggingface.co/Alibaba-NLP/gte-Qwen2-1.5B-instruct) | Multilingual | 32000 | 1536 | 6.62GB |
## Cloud API Services
In addition to the open-source [GTE](https://huggingface.co/collections/Alibaba-NLP/gte-models-6680f0b13f885cb431e6d469) series models, the GTE models are also available as commercial API services on Alibaba Cloud.
- [Embedding Models](https://help.aliyun.com/zh/model-studio/developer-reference/general-text-embedding/): Three versions of the text embedding models are available: text-embedding-v1/v2/v3, with v3 being the latest API service.
- [ReRank Models](https://help.aliyun.com/zh/model-studio/developer-reference/general-text-sorting-model/): The gte-rerank model service is available.
Note that the models behind the commercial APIs are not entirely identical to the open-source models.
## Citation
If you find our paper or models helpful, please consider citing:
```
@article{li2023towards,
title={Towards general text embeddings with multi-stage contrastive learning},
author={Li, Zehan and Zhang, Xin and Zhang, Yanzhao and Long, Dingkun and Xie, Pengjun and Zhang, Meishan},
journal={arXiv preprint arXiv:2308.03281},
year={2023}
}
```
73.38799999999999}, {"type": "map_at_100", "value": 73.747}, {"type": "map_at_1000", "value": 73.75}, {"type": "map_at_3", "value": 70.04599999999999}, {"type": "map_at_5", "value": 72.095}, {"type": "mrr_at_1", "value": 65.0}, {"type": "mrr_at_10", "value": 74.42800000000001}, {"type": "mrr_at_100", "value": 74.722}, {"type": "mrr_at_1000", "value": 74.725}, {"type": "mrr_at_3", "value": 72.056}, {"type": "mrr_at_5", "value": 73.60600000000001}, {"type": "ndcg_at_1", "value": 65.0}, {"type": "ndcg_at_10", "value": 78.435}, {"type": "ndcg_at_100", "value": 79.922}, {"type": "ndcg_at_1000", "value": 80.00500000000001}, {"type": "ndcg_at_3", "value": 73.05199999999999}, {"type": "ndcg_at_5", "value": 75.98}, {"type": "precision_at_1", "value": 65.0}, {"type": "precision_at_10", "value": 10.5}, {"type": "precision_at_100", "value": 1.123}, {"type": "precision_at_1000", "value": 0.11299999999999999}, {"type": "precision_at_3", "value": 28.555999999999997}, {"type": "precision_at_5", "value": 19.0}, {"type": "recall_at_1", "value": 61.99400000000001}, {"type": "recall_at_10", "value": 92.72200000000001}, {"type": "recall_at_100", "value": 99.333}, {"type": "recall_at_1000", "value": 100.0}, {"type": "recall_at_3", "value": 78.739}, {"type": "recall_at_5", "value": 85.828}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB SprintDuplicateQuestions", "type": "mteb/sprintduplicatequestions-pairclassification", "config": "default", "split": "test", "revision": "d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46"}, "metrics": [{"type": "cos_sim_accuracy", "value": 99.79009900990098}, {"type": "cos_sim_ap", "value": 95.3203137438653}, {"type": "cos_sim_f1", "value": 89.12386706948641}, {"type": "cos_sim_precision", "value": 89.75659229208925}, {"type": "cos_sim_recall", "value": 88.5}, {"type": "dot_accuracy", "value": 99.67821782178218}, {"type": "dot_ap", "value": 89.94069840000675}, {"type": "dot_f1", "value": 83.45902463549521}, {"type": "dot_precision", "value": 83.9231547017189}, {"type": "dot_recall", "value": 83.0}, {"type": "euclidean_accuracy", "value": 99.78613861386138}, {"type": "euclidean_ap", "value": 95.10648259135526}, {"type": "euclidean_f1", "value": 88.77338877338877}, {"type": "euclidean_precision", "value": 92.42424242424242}, {"type": "euclidean_recall", "value": 85.39999999999999}, {"type": "manhattan_accuracy", "value": 99.7950495049505}, {"type": "manhattan_ap", "value": 95.29987661320946}, {"type": "manhattan_f1", "value": 89.21313183949972}, {"type": "manhattan_precision", "value": 93.14472252448314}, {"type": "manhattan_recall", "value": 85.6}, {"type": "max_accuracy", "value": 99.7950495049505}, {"type": "max_ap", "value": 95.3203137438653}, {"type": "max_f1", "value": 89.21313183949972}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB StackExchangeClustering", "type": "mteb/stackexchange-clustering", "config": "default", "split": "test", "revision": "6cbc1f7b2bc0622f2e39d2c77fa502909748c259"}, "metrics": [{"type": "v_measure", "value": 67.65446577183913}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB StackExchangeClusteringP2P", "type": "mteb/stackexchange-clustering-p2p", "config": "default", "split": "test", "revision": "815ca46b2622cec33ccafc3735d572c266efdb44"}, "metrics": [{"type": "v_measure", "value": 46.30749237193961}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB StackOverflowDupQuestions", "type": "mteb/stackoverflowdupquestions-reranking", "config": "default", "split": "test", "revision": 
"e185fbe320c72810689fc5848eb6114e1ef5ec69"}, "metrics": [{"type": "map", "value": 54.91481849959949}, {"type": "mrr", "value": 55.853506175197346}]}, {"task": {"type": "Summarization"}, "dataset": {"name": "MTEB SummEval", "type": "mteb/summeval", "config": "default", "split": "test", "revision": "cda12ad7615edc362dbf25a00fdd61d3b1eaf93c"}, "metrics": [{"type": "cos_sim_pearson", "value": 30.08196549170419}, {"type": "cos_sim_spearman", "value": 31.16661390597077}, {"type": "dot_pearson", "value": 29.892258410943466}, {"type": "dot_spearman", "value": 30.51328811965085}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB TRECCOVID", "type": "mteb/trec-covid", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "map_at_1", "value": 0.23900000000000002}, {"type": "map_at_10", "value": 2.173}, {"type": "map_at_100", "value": 14.24}, {"type": "map_at_1000", "value": 35.309000000000005}, {"type": "map_at_3", "value": 0.7100000000000001}, {"type": "map_at_5", "value": 1.163}, {"type": "mrr_at_1", "value": 92.0}, {"type": "mrr_at_10", "value": 96.0}, {"type": "mrr_at_100", "value": 96.0}, {"type": "mrr_at_1000", "value": 96.0}, {"type": "mrr_at_3", "value": 96.0}, {"type": "mrr_at_5", "value": 96.0}, {"type": "ndcg_at_1", "value": 90.0}, {"type": "ndcg_at_10", "value": 85.382}, {"type": "ndcg_at_100", "value": 68.03}, {"type": "ndcg_at_1000", "value": 61.021}, {"type": "ndcg_at_3", "value": 89.765}, {"type": "ndcg_at_5", "value": 88.444}, {"type": "precision_at_1", "value": 92.0}, {"type": "precision_at_10", "value": 88.0}, {"type": "precision_at_100", "value": 70.02000000000001}, {"type": "precision_at_1000", "value": 26.984}, {"type": "precision_at_3", "value": 94.0}, {"type": "precision_at_5", "value": 92.80000000000001}, {"type": "recall_at_1", "value": 0.23900000000000002}, {"type": "recall_at_10", "value": 2.313}, {"type": "recall_at_100", "value": 17.049}, {"type": "recall_at_1000", "value": 57.489999999999995}, {"type": "recall_at_3", "value": 0.737}, {"type": "recall_at_5", "value": 1.221}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB Touche2020", "type": "mteb/touche2020", "config": "default", "split": "test", "revision": "a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f"}, "metrics": [{"type": "map_at_1", "value": 2.75}, {"type": "map_at_10", "value": 11.29}, {"type": "map_at_100", "value": 18.032999999999998}, {"type": "map_at_1000", "value": 19.746}, {"type": "map_at_3", "value": 6.555}, {"type": "map_at_5", "value": 8.706999999999999}, {"type": "mrr_at_1", "value": 34.694}, {"type": "mrr_at_10", "value": 50.55}, {"type": "mrr_at_100", "value": 51.659}, {"type": "mrr_at_1000", "value": 51.659}, {"type": "mrr_at_3", "value": 47.278999999999996}, {"type": "mrr_at_5", "value": 49.728}, {"type": "ndcg_at_1", "value": 32.653}, {"type": "ndcg_at_10", "value": 27.894000000000002}, {"type": "ndcg_at_100", "value": 39.769}, {"type": "ndcg_at_1000", "value": 51.495999999999995}, {"type": "ndcg_at_3", "value": 32.954}, {"type": "ndcg_at_5", "value": 31.502999999999997}, {"type": "precision_at_1", "value": 34.694}, {"type": "precision_at_10", "value": 23.265}, {"type": "precision_at_100", "value": 7.898}, {"type": "precision_at_1000", "value": 1.58}, {"type": "precision_at_3", "value": 34.694}, {"type": "precision_at_5", "value": 31.429000000000002}, {"type": "recall_at_1", "value": 2.75}, {"type": "recall_at_10", "value": 16.953}, {"type": "recall_at_100", "value": 48.68}, {"type": "recall_at_1000", "value": 85.18599999999999}, {"type": "recall_at_3", 
"value": 7.710999999999999}, {"type": "recall_at_5", "value": 11.484}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB ToxicConversationsClassification", "type": "mteb/toxic_conversations_50k", "config": "default", "split": "test", "revision": "d7c0de2777da35d6aae2200a62c6e0e5af397c4c"}, "metrics": [{"type": "accuracy", "value": 82.66099999999999}, {"type": "ap", "value": 25.555698090238337}, {"type": "f1", "value": 66.48402012461622}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB TweetSentimentExtractionClassification", "type": "mteb/tweet_sentiment_extraction", "config": "default", "split": "test", "revision": "d604517c81ca91fe16a244d1248fc021f9ecee7a"}, "metrics": [{"type": "accuracy", "value": 72.94567062818335}, {"type": "f1", "value": 73.28139189595674}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB TwentyNewsgroupsClustering", "type": "mteb/twentynewsgroups-clustering", "config": "default", "split": "test", "revision": "6125ec4e24fa026cec8a478383ee943acfbd5449"}, "metrics": [{"type": "v_measure", "value": 49.581627240203474}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB TwitterSemEval2015", "type": "mteb/twittersemeval2015-pairclassification", "config": "default", "split": "test", "revision": "70970daeab8776df92f5ea462b6173c0b46fd2d1"}, "metrics": [{"type": "cos_sim_accuracy", "value": 87.78089050485785}, {"type": "cos_sim_ap", "value": 79.64487116574168}, {"type": "cos_sim_f1", "value": 72.46563021970964}, {"type": "cos_sim_precision", "value": 70.62359128474831}, {"type": "cos_sim_recall", "value": 74.40633245382587}, {"type": "dot_accuracy", "value": 86.2609524944865}, {"type": "dot_ap", "value": 75.513046857613}, {"type": "dot_f1", "value": 68.58213616489695}, {"type": "dot_precision", "value": 65.12455516014235}, {"type": "dot_recall", "value": 72.42744063324538}, {"type": "euclidean_accuracy", "value": 87.6080348095607}, {"type": "euclidean_ap", "value": 79.00204933649795}, {"type": "euclidean_f1", "value": 72.14495342605589}, {"type": "euclidean_precision", "value": 69.85421299728193}, {"type": "euclidean_recall", "value": 74.5910290237467}, {"type": "manhattan_accuracy", "value": 87.59611372712642}, {"type": "manhattan_ap", "value": 78.78523756706264}, {"type": "manhattan_f1", "value": 71.86499137718648}, {"type": "manhattan_precision", "value": 67.39833641404806}, {"type": "manhattan_recall", "value": 76.96569920844327}, {"type": "max_accuracy", "value": 87.78089050485785}, {"type": "max_ap", "value": 79.64487116574168}, {"type": "max_f1", "value": 72.46563021970964}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB TwitterURLCorpus", "type": "mteb/twitterurlcorpus-pairclassification", "config": "default", "split": "test", "revision": "8b6510b0b1fa4e4c4f879467980e9be563ec1cdf"}, "metrics": [{"type": "cos_sim_accuracy", "value": 89.98719292117825}, {"type": "cos_sim_ap", "value": 87.58146137353202}, {"type": "cos_sim_f1", "value": 80.28543232369239}, {"type": "cos_sim_precision", "value": 79.1735289714029}, {"type": "cos_sim_recall", "value": 81.42901139513397}, {"type": "dot_accuracy", "value": 88.9199363526992}, {"type": "dot_ap", "value": 84.98499998630417}, {"type": "dot_f1", "value": 78.21951400757969}, {"type": "dot_precision", "value": 75.58523624874336}, {"type": "dot_recall", "value": 81.04404065291038}, {"type": "euclidean_accuracy", "value": 89.77374160748244}, {"type": "euclidean_ap", "value": 87.35151562835209}, {"type": "euclidean_f1", "value": 79.92160922940393}, 
{"type": "euclidean_precision", "value": 76.88531587933979}, {"type": "euclidean_recall", "value": 83.20757622420696}, {"type": "manhattan_accuracy", "value": 89.72717041176699}, {"type": "manhattan_ap", "value": 87.34065592142515}, {"type": "manhattan_f1", "value": 79.85603419187943}, {"type": "manhattan_precision", "value": 77.82243332115455}, {"type": "manhattan_recall", "value": 81.99876809362489}, {"type": "max_accuracy", "value": 89.98719292117825}, {"type": "max_ap", "value": 87.58146137353202}, {"type": "max_f1", "value": 80.28543232369239}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB AFQMC", "type": "C-MTEB/AFQMC", "config": "default", "split": "validation", "revision": "b44c3b011063adb25877c13823db83bb193913c4"}, "metrics": [{"type": "cos_sim_pearson", "value": 53.45954203592337}, {"type": "cos_sim_spearman", "value": 58.42154680418638}, {"type": "euclidean_pearson", "value": 56.41543791722753}, {"type": "euclidean_spearman", "value": 58.39328016640146}, {"type": "manhattan_pearson", "value": 56.318510356833876}, {"type": "manhattan_spearman", "value": 58.28423447818184}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB ATEC", "type": "C-MTEB/ATEC", "config": "default", "split": "test", "revision": "0f319b1142f28d00e055a6770f3f726ae9b7d865"}, "metrics": [{"type": "cos_sim_pearson", "value": 50.78356460675945}, {"type": "cos_sim_spearman", "value": 55.6530411663269}, {"type": "euclidean_pearson", "value": 56.50763660417816}, {"type": "euclidean_spearman", "value": 55.733823335669065}, {"type": "manhattan_pearson", "value": 56.45323093512866}, {"type": "manhattan_spearman", "value": 55.63248619032702}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB AmazonReviewsClassification (zh)", "type": "mteb/amazon_reviews_multi", "config": "zh", "split": "test", "revision": "1399c76144fd37290681b995c656ef9b2e06e26d"}, "metrics": [{"type": "accuracy", "value": 47.209999999999994}, {"type": "f1", "value": 46.08892432018655}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB BQ", "type": "C-MTEB/BQ", "config": "default", "split": "test", "revision": "e3dda5e115e487b39ec7e618c0c6a29137052a55"}, "metrics": [{"type": "cos_sim_pearson", "value": 70.25573992001478}, {"type": "cos_sim_spearman", "value": 73.85247134951433}, {"type": "euclidean_pearson", "value": 72.60033082168442}, {"type": "euclidean_spearman", "value": 73.72445893756499}, {"type": "manhattan_pearson", "value": 72.59932284620231}, {"type": "manhattan_spearman", "value": 73.68002490614583}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB CLSClusteringP2P", "type": "C-MTEB/CLSClusteringP2P", "config": "default", "split": "test", "revision": "4b6227591c6c1a73bc76b1055f3b7f3588e72476"}, "metrics": [{"type": "v_measure", "value": 45.21317724305628}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB CLSClusteringS2S", "type": "C-MTEB/CLSClusteringS2S", "config": "default", "split": "test", "revision": "e458b3f5414b62b7f9f83499ac1f5497ae2e869f"}, "metrics": [{"type": "v_measure", "value": 42.49825170976724}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB CMedQAv1", "type": "C-MTEB/CMedQAv1-reranking", "config": "default", "split": "test", "revision": "8d7f1e942507dac42dc58017c1a001c3717da7df"}, "metrics": [{"type": "map", "value": 88.15661686810597}, {"type": "mrr", "value": 90.11222222222223}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB CMedQAv2", "type": "C-MTEB/CMedQAv2-reranking", "config": "default", "split": "test", "revision": 
"23d186750531a14a0357ca22cd92d712fd512ea0"}, "metrics": [{"type": "map", "value": 88.1204726064383}, {"type": "mrr", "value": 90.20142857142858}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CmedqaRetrieval", "type": "C-MTEB/CmedqaRetrieval", "config": "default", "split": "dev", "revision": "cd540c506dae1cf9e9a59c3e06f42030d54e7301"}, "metrics": [{"type": "map_at_1", "value": 27.224999999999998}, {"type": "map_at_10", "value": 40.169}, {"type": "map_at_100", "value": 42.0}, {"type": "map_at_1000", "value": 42.109}, {"type": "map_at_3", "value": 35.76}, {"type": "map_at_5", "value": 38.221}, {"type": "mrr_at_1", "value": 40.56}, {"type": "mrr_at_10", "value": 49.118}, {"type": "mrr_at_100", "value": 50.092999999999996}, {"type": "mrr_at_1000", "value": 50.133}, {"type": "mrr_at_3", "value": 46.507}, {"type": "mrr_at_5", "value": 47.973}, {"type": "ndcg_at_1", "value": 40.56}, {"type": "ndcg_at_10", "value": 46.972}, {"type": "ndcg_at_100", "value": 54.04}, {"type": "ndcg_at_1000", "value": 55.862}, {"type": "ndcg_at_3", "value": 41.36}, {"type": "ndcg_at_5", "value": 43.704}, {"type": "precision_at_1", "value": 40.56}, {"type": "precision_at_10", "value": 10.302999999999999}, {"type": "precision_at_100", "value": 1.606}, {"type": "precision_at_1000", "value": 0.184}, {"type": "precision_at_3", "value": 23.064}, {"type": "precision_at_5", "value": 16.764000000000003}, {"type": "recall_at_1", "value": 27.224999999999998}, {"type": "recall_at_10", "value": 58.05200000000001}, {"type": "recall_at_100", "value": 87.092}, {"type": "recall_at_1000", "value": 99.099}, {"type": "recall_at_3", "value": 41.373}, {"type": "recall_at_5", "value": 48.453}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB Cmnli", "type": "C-MTEB/CMNLI", "config": "default", "split": "validation", "revision": "41bc36f332156f7adc9e38f53777c959b2ae9766"}, "metrics": [{"type": "cos_sim_accuracy", "value": 77.40228502705953}, {"type": "cos_sim_ap", "value": 86.22359172956327}, {"type": "cos_sim_f1", "value": 78.96328293736501}, {"type": "cos_sim_precision", "value": 73.36945615091311}, {"type": "cos_sim_recall", "value": 85.48047696983868}, {"type": "dot_accuracy", "value": 75.53818400481059}, {"type": "dot_ap", "value": 83.70164011305312}, {"type": "dot_f1", "value": 77.67298719348754}, {"type": "dot_precision", "value": 67.49482401656314}, {"type": "dot_recall", "value": 91.46598082768296}, {"type": "euclidean_accuracy", "value": 77.94347564642213}, {"type": "euclidean_ap", "value": 86.4652108728609}, {"type": "euclidean_f1", "value": 79.15555555555555}, {"type": "euclidean_precision", "value": 75.41816641964853}, {"type": "euclidean_recall", "value": 83.28267477203647}, {"type": "manhattan_accuracy", "value": 77.45039085989175}, {"type": "manhattan_ap", "value": 86.09986583900665}, {"type": "manhattan_f1", "value": 78.93669264438988}, {"type": "manhattan_precision", "value": 72.63261296660117}, {"type": "manhattan_recall", "value": 86.43909282207154}, {"type": "max_accuracy", "value": 77.94347564642213}, {"type": "max_ap", "value": 86.4652108728609}, {"type": "max_f1", "value": 79.15555555555555}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CovidRetrieval", "type": "C-MTEB/CovidRetrieval", "config": "default", "split": "dev", "revision": "1271c7809071a13532e05f25fb53511ffce77117"}, "metrics": [{"type": "map_at_1", "value": 69.336}, {"type": "map_at_10", "value": 77.16}, {"type": "map_at_100", "value": 77.47500000000001}, {"type": "map_at_1000", "value": 77.482}, {"type": 
"map_at_3", "value": 75.42999999999999}, {"type": "map_at_5", "value": 76.468}, {"type": "mrr_at_1", "value": 69.44200000000001}, {"type": "mrr_at_10", "value": 77.132}, {"type": "mrr_at_100", "value": 77.43299999999999}, {"type": "mrr_at_1000", "value": 77.44}, {"type": "mrr_at_3", "value": 75.395}, {"type": "mrr_at_5", "value": 76.459}, {"type": "ndcg_at_1", "value": 69.547}, {"type": "ndcg_at_10", "value": 80.794}, {"type": "ndcg_at_100", "value": 82.245}, {"type": "ndcg_at_1000", "value": 82.40899999999999}, {"type": "ndcg_at_3", "value": 77.303}, {"type": "ndcg_at_5", "value": 79.168}, {"type": "precision_at_1", "value": 69.547}, {"type": "precision_at_10", "value": 9.305}, {"type": "precision_at_100", "value": 0.9979999999999999}, {"type": "precision_at_1000", "value": 0.101}, {"type": "precision_at_3", "value": 27.749000000000002}, {"type": "precision_at_5", "value": 17.576}, {"type": "recall_at_1", "value": 69.336}, {"type": "recall_at_10", "value": 92.097}, {"type": "recall_at_100", "value": 98.736}, {"type": "recall_at_1000", "value": 100.0}, {"type": "recall_at_3", "value": 82.64}, {"type": "recall_at_5", "value": 87.144}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB DuRetrieval", "type": "C-MTEB/DuRetrieval", "config": "default", "split": "dev", "revision": "a1a333e290fe30b10f3f56498e3a0d911a693ced"}, "metrics": [{"type": "map_at_1", "value": 26.817999999999998}, {"type": "map_at_10", "value": 82.67}, {"type": "map_at_100", "value": 85.304}, {"type": "map_at_1000", "value": 85.334}, {"type": "map_at_3", "value": 57.336}, {"type": "map_at_5", "value": 72.474}, {"type": "mrr_at_1", "value": 91.45}, {"type": "mrr_at_10", "value": 94.272}, {"type": "mrr_at_100", "value": 94.318}, {"type": "mrr_at_1000", "value": 94.32000000000001}, {"type": "mrr_at_3", "value": 94.0}, {"type": "mrr_at_5", "value": 94.17699999999999}, {"type": "ndcg_at_1", "value": 91.45}, {"type": "ndcg_at_10", "value": 89.404}, {"type": "ndcg_at_100", "value": 91.724}, {"type": "ndcg_at_1000", "value": 91.973}, {"type": "ndcg_at_3", "value": 88.104}, {"type": "ndcg_at_5", "value": 87.25699999999999}, {"type": "precision_at_1", "value": 91.45}, {"type": "precision_at_10", "value": 42.585}, {"type": "precision_at_100", "value": 4.838}, {"type": "precision_at_1000", "value": 0.49}, {"type": "precision_at_3", "value": 78.8}, {"type": "precision_at_5", "value": 66.66}, {"type": "recall_at_1", "value": 26.817999999999998}, {"type": "recall_at_10", "value": 90.67}, {"type": "recall_at_100", "value": 98.36200000000001}, {"type": "recall_at_1000", "value": 99.583}, {"type": "recall_at_3", "value": 59.614999999999995}, {"type": "recall_at_5", "value": 77.05199999999999}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB EcomRetrieval", "type": "C-MTEB/EcomRetrieval", "config": "default", "split": "dev", "revision": "687de13dc7294d6fd9be10c6945f9e8fec8166b9"}, "metrics": [{"type": "map_at_1", "value": 47.699999999999996}, {"type": "map_at_10", "value": 57.589999999999996}, {"type": "map_at_100", "value": 58.226}, {"type": "map_at_1000", "value": 58.251}, {"type": "map_at_3", "value": 55.233}, {"type": "map_at_5", "value": 56.633}, {"type": "mrr_at_1", "value": 47.699999999999996}, {"type": "mrr_at_10", "value": 57.589999999999996}, {"type": "mrr_at_100", "value": 58.226}, {"type": "mrr_at_1000", "value": 58.251}, {"type": "mrr_at_3", "value": 55.233}, {"type": "mrr_at_5", "value": 56.633}, {"type": "ndcg_at_1", "value": 47.699999999999996}, {"type": "ndcg_at_10", "value": 62.505}, {"type": 
"ndcg_at_100", "value": 65.517}, {"type": "ndcg_at_1000", "value": 66.19800000000001}, {"type": "ndcg_at_3", "value": 57.643}, {"type": "ndcg_at_5", "value": 60.181}, {"type": "precision_at_1", "value": 47.699999999999996}, {"type": "precision_at_10", "value": 7.8}, {"type": "precision_at_100", "value": 0.919}, {"type": "precision_at_1000", "value": 0.097}, {"type": "precision_at_3", "value": 21.532999999999998}, {"type": "precision_at_5", "value": 14.16}, {"type": "recall_at_1", "value": 47.699999999999996}, {"type": "recall_at_10", "value": 78.0}, {"type": "recall_at_100", "value": 91.9}, {"type": "recall_at_1000", "value": 97.3}, {"type": "recall_at_3", "value": 64.60000000000001}, {"type": "recall_at_5", "value": 70.8}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB IFlyTek", "type": "C-MTEB/IFlyTek-classification", "config": "default", "split": "validation", "revision": "421605374b29664c5fc098418fe20ada9bd55f8a"}, "metrics": [{"type": "accuracy", "value": 44.84801846864178}, {"type": "f1", "value": 37.47347897956339}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB JDReview", "type": "C-MTEB/JDReview-classification", "config": "default", "split": "test", "revision": "b7c64bd89eb87f8ded463478346f76731f07bf8b"}, "metrics": [{"type": "accuracy", "value": 85.81613508442777}, {"type": "ap", "value": 52.68244615477374}, {"type": "f1", "value": 80.0445640948843}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB LCQMC", "type": "C-MTEB/LCQMC", "config": "default", "split": "test", "revision": "17f9b096f80380fce5ed12a9be8be7784b337daf"}, "metrics": [{"type": "cos_sim_pearson", "value": 69.57786502217138}, {"type": "cos_sim_spearman", "value": 75.39106054489906}, {"type": "euclidean_pearson", "value": 73.72082954602402}, {"type": "euclidean_spearman", "value": 75.14421475913619}, {"type": "manhattan_pearson", "value": 73.62463076633642}, {"type": "manhattan_spearman", "value": 75.01301565104112}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB MMarcoReranking", "type": "C-MTEB/Mmarco-reranking", "config": "default", "split": "dev", "revision": "None"}, "metrics": [{"type": "map", "value": 29.143797057999134}, {"type": "mrr", "value": 28.08174603174603}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB MMarcoRetrieval", "type": "C-MTEB/MMarcoRetrieval", "config": "default", "split": "dev", "revision": "539bbde593d947e2a124ba72651aafc09eb33fc2"}, "metrics": [{"type": "map_at_1", "value": 70.492}, {"type": "map_at_10", "value": 79.501}, {"type": "map_at_100", "value": 79.728}, {"type": "map_at_1000", "value": 79.735}, {"type": "map_at_3", "value": 77.77}, {"type": "map_at_5", "value": 78.851}, {"type": "mrr_at_1", "value": 72.822}, {"type": "mrr_at_10", "value": 80.001}, {"type": "mrr_at_100", "value": 80.19}, {"type": "mrr_at_1000", "value": 80.197}, {"type": "mrr_at_3", "value": 78.484}, {"type": "mrr_at_5", "value": 79.42099999999999}, {"type": "ndcg_at_1", "value": 72.822}, {"type": "ndcg_at_10", "value": 83.013}, {"type": "ndcg_at_100", "value": 84.013}, {"type": "ndcg_at_1000", "value": 84.20400000000001}, {"type": "ndcg_at_3", "value": 79.728}, {"type": "ndcg_at_5", "value": 81.542}, {"type": "precision_at_1", "value": 72.822}, {"type": "precision_at_10", "value": 9.917}, {"type": "precision_at_100", "value": 1.042}, {"type": "precision_at_1000", "value": 0.106}, {"type": "precision_at_3", "value": 29.847}, {"type": "precision_at_5", "value": 18.871}, {"type": "recall_at_1", "value": 70.492}, {"type": "recall_at_10", "value": 
93.325}, {"type": "recall_at_100", "value": 97.822}, {"type": "recall_at_1000", "value": 99.319}, {"type": "recall_at_3", "value": 84.636}, {"type": "recall_at_5", "value": 88.93100000000001}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MassiveIntentClassification (zh-CN)", "type": "mteb/amazon_massive_intent", "config": "zh-CN", "split": "test", "revision": "31efe3c427b0bae9c22cbb560b8f15491cc6bed7"}, "metrics": [{"type": "accuracy", "value": 76.88298587760592}, {"type": "f1", "value": 73.89001762017176}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MassiveScenarioClassification (zh-CN)", "type": "mteb/amazon_massive_scenario", "config": "zh-CN", "split": "test", "revision": "7d571f92784cd94a019292a1f45445077d0ef634"}, "metrics": [{"type": "accuracy", "value": 80.76328177538669}, {"type": "f1", "value": 80.24718532423358}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB MedicalRetrieval", "type": "C-MTEB/MedicalRetrieval", "config": "default", "split": "dev", "revision": "2039188fb5800a9803ba5048df7b76e6fb151fc6"}, "metrics": [{"type": "map_at_1", "value": 49.6}, {"type": "map_at_10", "value": 55.620999999999995}, {"type": "map_at_100", "value": 56.204}, {"type": "map_at_1000", "value": 56.251}, {"type": "map_at_3", "value": 54.132999999999996}, {"type": "map_at_5", "value": 54.933}, {"type": "mrr_at_1", "value": 49.7}, {"type": "mrr_at_10", "value": 55.67100000000001}, {"type": "mrr_at_100", "value": 56.254000000000005}, {"type": "mrr_at_1000", "value": 56.301}, {"type": "mrr_at_3", "value": 54.18300000000001}, {"type": "mrr_at_5", "value": 54.983000000000004}, {"type": "ndcg_at_1", "value": 49.6}, {"type": "ndcg_at_10", "value": 58.645}, {"type": "ndcg_at_100", "value": 61.789}, {"type": "ndcg_at_1000", "value": 63.219}, {"type": "ndcg_at_3", "value": 55.567}, {"type": "ndcg_at_5", "value": 57.008}, {"type": "precision_at_1", "value": 49.6}, {"type": "precision_at_10", "value": 6.819999999999999}, {"type": "precision_at_100", "value": 0.836}, {"type": "precision_at_1000", "value": 0.095}, {"type": "precision_at_3", "value": 19.900000000000002}, {"type": "precision_at_5", "value": 12.64}, {"type": "recall_at_1", "value": 49.6}, {"type": "recall_at_10", "value": 68.2}, {"type": "recall_at_100", "value": 83.6}, {"type": "recall_at_1000", "value": 95.3}, {"type": "recall_at_3", "value": 59.699999999999996}, {"type": "recall_at_5", "value": 63.2}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MultilingualSentiment", "type": "C-MTEB/MultilingualSentiment-classification", "config": "default", "split": "validation", "revision": "46958b007a63fdbf239b7672c25d0bea67b5ea1a"}, "metrics": [{"type": "accuracy", "value": 74.45666666666666}, {"type": "f1", "value": 74.32582402190089}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB Ocnli", "type": "C-MTEB/OCNLI", "config": "default", "split": "validation", "revision": "66e76a618a34d6d565d5538088562851e6daa7ec"}, "metrics": [{"type": "cos_sim_accuracy", "value": 80.67135896047645}, {"type": "cos_sim_ap", "value": 87.60421240712051}, {"type": "cos_sim_f1", "value": 82.1304131408661}, {"type": "cos_sim_precision", "value": 77.68361581920904}, {"type": "cos_sim_recall", "value": 87.11721224920802}, {"type": "dot_accuracy", "value": 79.04710341093666}, {"type": "dot_ap", "value": 85.6370059719336}, {"type": "dot_f1", "value": 80.763723150358}, {"type": "dot_precision", "value": 73.69337979094077}, {"type": "dot_recall", "value": 89.33474128827878}, {"type": 
"euclidean_accuracy", "value": 81.05035192203573}, {"type": "euclidean_ap", "value": 87.7880240053663}, {"type": "euclidean_f1", "value": 82.50244379276637}, {"type": "euclidean_precision", "value": 76.7970882620564}, {"type": "euclidean_recall", "value": 89.1235480464625}, {"type": "manhattan_accuracy", "value": 80.61721710882512}, {"type": "manhattan_ap", "value": 87.43568120591175}, {"type": "manhattan_f1", "value": 81.89526184538653}, {"type": "manhattan_precision", "value": 77.5992438563327}, {"type": "manhattan_recall", "value": 86.6948257655755}, {"type": "max_accuracy", "value": 81.05035192203573}, {"type": "max_ap", "value": 87.7880240053663}, {"type": "max_f1", "value": 82.50244379276637}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB OnlineShopping", "type": "C-MTEB/OnlineShopping-classification", "config": "default", "split": "test", "revision": "e610f2ebd179a8fda30ae534c3878750a96db120"}, "metrics": [{"type": "accuracy", "value": 93.5}, {"type": "ap", "value": 91.31357903446782}, {"type": "f1", "value": 93.48088994006616}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB PAWSX", "type": "C-MTEB/PAWSX", "config": "default", "split": "test", "revision": "9c6a90e430ac22b5779fb019a23e820b11a8b5e1"}, "metrics": [{"type": "cos_sim_pearson", "value": 36.93293453538077}, {"type": "cos_sim_spearman", "value": 42.45972506308574}, {"type": "euclidean_pearson", "value": 42.34945133152159}, {"type": "euclidean_spearman", "value": 42.331610303674644}, {"type": "manhattan_pearson", "value": 42.31455070249498}, {"type": "manhattan_spearman", "value": 42.19887982891834}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB QBQTC", "type": "C-MTEB/QBQTC", "config": "default", "split": "test", "revision": "790b0510dc52b1553e8c49f3d2afb48c0e5c48b7"}, "metrics": [{"type": "cos_sim_pearson", "value": 33.683290790043785}, {"type": "cos_sim_spearman", "value": 35.149171171202994}, {"type": "euclidean_pearson", "value": 32.33806561267862}, {"type": "euclidean_spearman", "value": 34.483576387347966}, {"type": "manhattan_pearson", "value": 32.47629754599608}, {"type": "manhattan_spearman", "value": 34.66434471867615}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS22 (zh)", "type": "mteb/sts22-crosslingual-sts", "config": "zh", "split": "test", "revision": "eea2b4fe26a775864c896887d910b76a8098ad3f"}, "metrics": [{"type": "cos_sim_pearson", "value": 66.46322760516104}, {"type": "cos_sim_spearman", "value": 67.398478319726}, {"type": "euclidean_pearson", "value": 64.7223480293625}, {"type": "euclidean_spearman", "value": 66.83118568812951}, {"type": "manhattan_pearson", "value": 64.88440039828305}, {"type": "manhattan_spearman", "value": 66.80429458952257}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STSB", "type": "C-MTEB/STSB", "config": "default", "split": "test", "revision": "0cde68302b3541bb8b3c340dc0644b0b745b3dc0"}, "metrics": [{"type": "cos_sim_pearson", "value": 79.08991383232105}, {"type": "cos_sim_spearman", "value": 79.39715677296854}, {"type": "euclidean_pearson", "value": 78.63201279320496}, {"type": "euclidean_spearman", "value": 79.40262660785731}, {"type": "manhattan_pearson", "value": 78.98138363146906}, {"type": "manhattan_spearman", "value": 79.79968413014194}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB T2Reranking", "type": "C-MTEB/T2Reranking", "config": "default", "split": "dev", "revision": "76631901a18387f85eaa53e5450019b87ad58ef9"}, "metrics": [{"type": "map", "value": 67.43289278789972}, {"type": "mrr", "value": 
77.53012460908535}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB T2Retrieval", "type": "C-MTEB/T2Retrieval", "config": "default", "split": "dev", "revision": "8731a845f1bf500a4f111cf1070785c793d10e64"}, "metrics": [{"type": "map_at_1", "value": 27.733999999999998}, {"type": "map_at_10", "value": 78.24799999999999}, {"type": "map_at_100", "value": 81.765}, {"type": "map_at_1000", "value": 81.824}, {"type": "map_at_3", "value": 54.92}, {"type": "map_at_5", "value": 67.61399999999999}, {"type": "mrr_at_1", "value": 90.527}, {"type": "mrr_at_10", "value": 92.843}, {"type": "mrr_at_100", "value": 92.927}, {"type": "mrr_at_1000", "value": 92.93}, {"type": "mrr_at_3", "value": 92.45100000000001}, {"type": "mrr_at_5", "value": 92.693}, {"type": "ndcg_at_1", "value": 90.527}, {"type": "ndcg_at_10", "value": 85.466}, {"type": "ndcg_at_100", "value": 88.846}, {"type": "ndcg_at_1000", "value": 89.415}, {"type": "ndcg_at_3", "value": 86.768}, {"type": "ndcg_at_5", "value": 85.46000000000001}, {"type": "precision_at_1", "value": 90.527}, {"type": "precision_at_10", "value": 42.488}, {"type": "precision_at_100", "value": 5.024}, {"type": "precision_at_1000", "value": 0.516}, {"type": "precision_at_3", "value": 75.907}, {"type": "precision_at_5", "value": 63.727000000000004}, {"type": "recall_at_1", "value": 27.733999999999998}, {"type": "recall_at_10", "value": 84.346}, {"type": "recall_at_100", "value": 95.536}, {"type": "recall_at_1000", "value": 98.42999999999999}, {"type": "recall_at_3", "value": 56.455}, {"type": "recall_at_5", "value": 70.755}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB TNews", "type": "C-MTEB/TNews-classification", "config": "default", "split": "validation", "revision": "317f262bf1e6126357bbe89e875451e4b0938fe4"}, "metrics": [{"type": "accuracy", "value": 49.952000000000005}, {"type": "f1", "value": 48.264617195258054}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB ThuNewsClusteringP2P", "type": "C-MTEB/ThuNewsClusteringP2P", "config": "default", "split": "test", "revision": "5798586b105c0434e4f0fe5e767abe619442cf93"}, "metrics": [{"type": "v_measure", "value": 68.23769904483508}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB ThuNewsClusteringS2S", "type": "C-MTEB/ThuNewsClusteringS2S", "config": "default", "split": "test", "revision": "8a8b2caeda43f39e13c4bc5bea0f8a667896e10d"}, "metrics": [{"type": "v_measure", "value": 62.50294403136556}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB VideoRetrieval", "type": "C-MTEB/VideoRetrieval", "config": "default", "split": "dev", "revision": "58c2597a5943a2ba48f4668c3b90d796283c5639"}, "metrics": [{"type": "map_at_1", "value": 54.0}, {"type": "map_at_10", "value": 63.668}, {"type": "map_at_100", "value": 64.217}, {"type": "map_at_1000", "value": 64.23100000000001}, {"type": "map_at_3", "value": 61.7}, {"type": "map_at_5", "value": 62.870000000000005}, {"type": "mrr_at_1", "value": 54.0}, {"type": "mrr_at_10", "value": 63.668}, {"type": "mrr_at_100", "value": 64.217}, {"type": "mrr_at_1000", "value": 64.23100000000001}, {"type": "mrr_at_3", "value": 61.7}, {"type": "mrr_at_5", "value": 62.870000000000005}, {"type": "ndcg_at_1", "value": 54.0}, {"type": "ndcg_at_10", "value": 68.11399999999999}, {"type": "ndcg_at_100", "value": 70.723}, {"type": "ndcg_at_1000", "value": 71.123}, {"type": "ndcg_at_3", "value": 64.074}, {"type": "ndcg_at_5", "value": 66.178}, {"type": "precision_at_1", "value": 54.0}, {"type": "precision_at_10", "value": 8.200000000000001}, 
{"type": "precision_at_100", "value": 0.941}, {"type": "precision_at_1000", "value": 0.097}, {"type": "precision_at_3", "value": 23.633000000000003}, {"type": "precision_at_5", "value": 15.2}, {"type": "recall_at_1", "value": 54.0}, {"type": "recall_at_10", "value": 82.0}, {"type": "recall_at_100", "value": 94.1}, {"type": "recall_at_1000", "value": 97.3}, {"type": "recall_at_3", "value": 70.89999999999999}, {"type": "recall_at_5", "value": 76.0}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB Waimai", "type": "C-MTEB/waimai-classification", "config": "default", "split": "test", "revision": "339287def212450dcaa9df8c22bf93e9980c7023"}, "metrics": [{"type": "accuracy", "value": 86.63000000000001}, {"type": "ap", "value": 69.99457882599567}, {"type": "f1", "value": 85.07735617998541}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB 8TagsClustering", "type": "PL-MTEB/8tags-clustering", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "v_measure", "value": 44.594104491193555}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB AllegroReviews", "type": "PL-MTEB/allegro-reviews", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "accuracy", "value": 63.97614314115309}, {"type": "f1", "value": 52.15634261679283}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB ArguAna-PL", "type": "clarin-knext/arguana-pl", "config": "default", "split": "test", "revision": "63fc86750af76253e8c760fc9e534bbf24d260a2"}, "metrics": [{"type": "map_at_1", "value": 32.646}, {"type": "map_at_10", "value": 47.963}, {"type": "map_at_100", "value": 48.789}, {"type": "map_at_1000", "value": 48.797000000000004}, {"type": "map_at_3", "value": 43.196}, {"type": "map_at_5", "value": 46.016}, {"type": "mrr_at_1", "value": 33.073}, {"type": "mrr_at_10", "value": 48.126000000000005}, {"type": "mrr_at_100", "value": 48.946}, {"type": "mrr_at_1000", "value": 48.953}, {"type": "mrr_at_3", "value": 43.374}, {"type": "mrr_at_5", "value": 46.147}, {"type": "ndcg_at_1", "value": 32.646}, {"type": "ndcg_at_10", "value": 56.481}, {"type": "ndcg_at_100", "value": 59.922}, {"type": "ndcg_at_1000", "value": 60.07}, {"type": "ndcg_at_3", "value": 46.675}, {"type": "ndcg_at_5", "value": 51.76500000000001}, {"type": "precision_at_1", "value": 32.646}, {"type": "precision_at_10", "value": 8.371}, {"type": "precision_at_100", "value": 0.9860000000000001}, {"type": "precision_at_1000", "value": 0.1}, {"type": "precision_at_3", "value": 18.919}, {"type": "precision_at_5", "value": 13.825999999999999}, {"type": "recall_at_1", "value": 32.646}, {"type": "recall_at_10", "value": 83.71300000000001}, {"type": "recall_at_100", "value": 98.578}, {"type": "recall_at_1000", "value": 99.644}, {"type": "recall_at_3", "value": 56.757000000000005}, {"type": "recall_at_5", "value": 69.132}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB CBD", "type": "PL-MTEB/cbd", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "accuracy", "value": 68.56}, {"type": "ap", "value": 23.310493680488513}, {"type": "f1", "value": 58.85369533105693}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB CDSC-E", "type": "PL-MTEB/cdsce-pairclassification", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "cos_sim_accuracy", "value": 88.5}, {"type": "cos_sim_ap", "value": 72.42140924378361}, {"type": "cos_sim_f1", "value": 66.0919540229885}, {"type": "cos_sim_precision", "value": 
72.78481012658227}, {"type": "cos_sim_recall", "value": 60.526315789473685}, {"type": "dot_accuracy", "value": 88.5}, {"type": "dot_ap", "value": 72.42140924378361}, {"type": "dot_f1", "value": 66.0919540229885}, {"type": "dot_precision", "value": 72.78481012658227}, {"type": "dot_recall", "value": 60.526315789473685}, {"type": "euclidean_accuracy", "value": 88.5}, {"type": "euclidean_ap", "value": 72.42140924378361}, {"type": "euclidean_f1", "value": 66.0919540229885}, {"type": "euclidean_precision", "value": 72.78481012658227}, {"type": "euclidean_recall", "value": 60.526315789473685}, {"type": "manhattan_accuracy", "value": 88.5}, {"type": "manhattan_ap", "value": 72.49745515311696}, {"type": "manhattan_f1", "value": 66.0968660968661}, {"type": "manhattan_precision", "value": 72.04968944099379}, {"type": "manhattan_recall", "value": 61.05263157894737}, {"type": "max_accuracy", "value": 88.5}, {"type": "max_ap", "value": 72.49745515311696}, {"type": "max_f1", "value": 66.0968660968661}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB CDSC-R", "type": "PL-MTEB/cdscr-sts", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "cos_sim_pearson", "value": 90.32269765590145}, {"type": "cos_sim_spearman", "value": 89.73666311491672}, {"type": "euclidean_pearson", "value": 88.2933868516544}, {"type": "euclidean_spearman", "value": 89.73666311491672}, {"type": "manhattan_pearson", "value": 88.33474590219448}, {"type": "manhattan_spearman", "value": 89.8548364866583}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB DBPedia-PL", "type": "clarin-knext/dbpedia-pl", "config": "default", "split": "test", "revision": "76afe41d9af165cc40999fcaa92312b8b012064a"}, "metrics": [{"type": "map_at_1", "value": 7.632999999999999}, {"type": "map_at_10", "value": 16.426}, {"type": "map_at_100", "value": 22.651}, {"type": "map_at_1000", "value": 24.372}, {"type": "map_at_3", "value": 11.706}, {"type": "map_at_5", "value": 13.529}, {"type": "mrr_at_1", "value": 60.75000000000001}, {"type": "mrr_at_10", "value": 68.613}, {"type": "mrr_at_100", "value": 69.001}, {"type": "mrr_at_1000", "value": 69.021}, {"type": "mrr_at_3", "value": 67.0}, {"type": "mrr_at_5", "value": 67.925}, {"type": "ndcg_at_1", "value": 49.875}, {"type": "ndcg_at_10", "value": 36.978}, {"type": "ndcg_at_100", "value": 40.031}, {"type": "ndcg_at_1000", "value": 47.566}, {"type": "ndcg_at_3", "value": 41.148}, {"type": "ndcg_at_5", "value": 38.702}, {"type": "precision_at_1", "value": 60.75000000000001}, {"type": "precision_at_10", "value": 29.7}, {"type": "precision_at_100", "value": 9.278}, {"type": "precision_at_1000", "value": 2.099}, {"type": "precision_at_3", "value": 44.0}, {"type": "precision_at_5", "value": 37.6}, {"type": "recall_at_1", "value": 7.632999999999999}, {"type": "recall_at_10", "value": 22.040000000000003}, {"type": "recall_at_100", "value": 44.024}, {"type": "recall_at_1000", "value": 67.848}, {"type": "recall_at_3", "value": 13.093}, {"type": "recall_at_5", "value": 15.973}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB FiQA-PL", "type": "clarin-knext/fiqa-pl", "config": "default", "split": "test", "revision": "2e535829717f8bf9dc829b7f911cc5bbd4e6608e"}, "metrics": [{"type": "map_at_1", "value": 15.473}, {"type": "map_at_10", "value": 24.579}, {"type": "map_at_100", "value": 26.387}, {"type": "map_at_1000", "value": 26.57}, {"type": "map_at_3", "value": 21.278}, {"type": "map_at_5", "value": 23.179}, {"type": "mrr_at_1", "value": 30.709999999999997}, {"type": 
"mrr_at_10", "value": 38.994}, {"type": "mrr_at_100", "value": 39.993}, {"type": "mrr_at_1000", "value": 40.044999999999995}, {"type": "mrr_at_3", "value": 36.342999999999996}, {"type": "mrr_at_5", "value": 37.846999999999994}, {"type": "ndcg_at_1", "value": 30.709999999999997}, {"type": "ndcg_at_10", "value": 31.608999999999998}, {"type": "ndcg_at_100", "value": 38.807}, {"type": "ndcg_at_1000", "value": 42.208}, {"type": "ndcg_at_3", "value": 28.086}, {"type": "ndcg_at_5", "value": 29.323}, {"type": "precision_at_1", "value": 30.709999999999997}, {"type": "precision_at_10", "value": 8.688}, {"type": "precision_at_100", "value": 1.608}, {"type": "precision_at_1000", "value": 0.22100000000000003}, {"type": "precision_at_3", "value": 18.724}, {"type": "precision_at_5", "value": 13.950999999999999}, {"type": "recall_at_1", "value": 15.473}, {"type": "recall_at_10", "value": 38.361000000000004}, {"type": "recall_at_100", "value": 65.2}, {"type": "recall_at_1000", "value": 85.789}, {"type": "recall_at_3", "value": 25.401}, {"type": "recall_at_5", "value": 30.875999999999998}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB HotpotQA-PL", "type": "clarin-knext/hotpotqa-pl", "config": "default", "split": "test", "revision": "a0bd479ac97b4ccb5bd6ce320c415d0bb4beb907"}, "metrics": [{"type": "map_at_1", "value": 38.096000000000004}, {"type": "map_at_10", "value": 51.44499999999999}, {"type": "map_at_100", "value": 52.325}, {"type": "map_at_1000", "value": 52.397000000000006}, {"type": "map_at_3", "value": 48.626999999999995}, {"type": "map_at_5", "value": 50.342}, {"type": "mrr_at_1", "value": 76.19200000000001}, {"type": "mrr_at_10", "value": 81.191}, {"type": "mrr_at_100", "value": 81.431}, {"type": "mrr_at_1000", "value": 81.443}, {"type": "mrr_at_3", "value": 80.30199999999999}, {"type": "mrr_at_5", "value": 80.85900000000001}, {"type": "ndcg_at_1", "value": 76.19200000000001}, {"type": "ndcg_at_10", "value": 60.9}, {"type": "ndcg_at_100", "value": 64.14699999999999}, {"type": "ndcg_at_1000", "value": 65.647}, {"type": "ndcg_at_3", "value": 56.818000000000005}, {"type": "ndcg_at_5", "value": 59.019999999999996}, {"type": "precision_at_1", "value": 76.19200000000001}, {"type": "precision_at_10", "value": 12.203}, {"type": "precision_at_100", "value": 1.478}, {"type": "precision_at_1000", "value": 0.168}, {"type": "precision_at_3", "value": 34.616}, {"type": "precision_at_5", "value": 22.515}, {"type": "recall_at_1", "value": 38.096000000000004}, {"type": "recall_at_10", "value": 61.013}, {"type": "recall_at_100", "value": 73.90299999999999}, {"type": "recall_at_1000", "value": 83.91}, {"type": "recall_at_3", "value": 51.92400000000001}, {"type": "recall_at_5", "value": 56.286}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB MSMARCO-PL", "type": "clarin-knext/msmarco-pl", "config": "default", "split": "test", "revision": "8634c07806d5cce3a6138e260e59b81760a0a640"}, "metrics": [{"type": "map_at_1", "value": 1.548}, {"type": "map_at_10", "value": 11.049000000000001}, {"type": "map_at_100", "value": 28.874}, {"type": "map_at_1000", "value": 34.931}, {"type": "map_at_3", "value": 4.162}, {"type": "map_at_5", "value": 6.396}, {"type": "mrr_at_1", "value": 90.69800000000001}, {"type": "mrr_at_10", "value": 92.093}, {"type": "mrr_at_100", "value": 92.345}, {"type": "mrr_at_1000", "value": 92.345}, {"type": "mrr_at_3", "value": 91.86}, {"type": "mrr_at_5", "value": 91.86}, {"type": "ndcg_at_1", "value": 74.031}, {"type": "ndcg_at_10", "value": 63.978}, {"type": "ndcg_at_100", 
"value": 53.101}, {"type": "ndcg_at_1000", "value": 60.675999999999995}, {"type": "ndcg_at_3", "value": 71.421}, {"type": "ndcg_at_5", "value": 68.098}, {"type": "precision_at_1", "value": 90.69800000000001}, {"type": "precision_at_10", "value": 71.86}, {"type": "precision_at_100", "value": 31.395}, {"type": "precision_at_1000", "value": 5.981}, {"type": "precision_at_3", "value": 84.49600000000001}, {"type": "precision_at_5", "value": 79.07}, {"type": "recall_at_1", "value": 1.548}, {"type": "recall_at_10", "value": 12.149000000000001}, {"type": "recall_at_100", "value": 40.794999999999995}, {"type": "recall_at_1000", "value": 67.974}, {"type": "recall_at_3", "value": 4.244}, {"type": "recall_at_5", "value": 6.608}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MassiveIntentClassification (pl)", "type": "mteb/amazon_massive_intent", "config": "pl", "split": "test", "revision": "31efe3c427b0bae9c22cbb560b8f15491cc6bed7"}, "metrics": [{"type": "accuracy", "value": 73.55413584398119}, {"type": "f1", "value": 69.65610882318181}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MassiveScenarioClassification (pl)", "type": "mteb/amazon_massive_scenario", "config": "pl", "split": "test", "revision": "7d571f92784cd94a019292a1f45445077d0ef634"}, "metrics": [{"type": "accuracy", "value": 76.37188971082716}, {"type": "f1", "value": 75.64847309941361}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB NFCorpus-PL", "type": "clarin-knext/nfcorpus-pl", "config": "default", "split": "test", "revision": "9a6f9567fda928260afed2de480d79c98bf0bec0"}, "metrics": [{"type": "map_at_1", "value": 4.919}, {"type": "map_at_10", "value": 10.834000000000001}, {"type": "map_at_100", "value": 13.38}, {"type": "map_at_1000", "value": 14.581}, {"type": "map_at_3", "value": 8.198}, {"type": "map_at_5", "value": 9.428}, {"type": "mrr_at_1", "value": 41.176}, {"type": "mrr_at_10", "value": 50.083}, {"type": "mrr_at_100", "value": 50.559}, {"type": "mrr_at_1000", "value": 50.604000000000006}, {"type": "mrr_at_3", "value": 47.936}, {"type": "mrr_at_5", "value": 49.407000000000004}, {"type": "ndcg_at_1", "value": 39.628}, {"type": "ndcg_at_10", "value": 30.098000000000003}, {"type": "ndcg_at_100", "value": 27.061}, {"type": "ndcg_at_1000", "value": 35.94}, {"type": "ndcg_at_3", "value": 35.135}, {"type": "ndcg_at_5", "value": 33.335}, {"type": "precision_at_1", "value": 41.176}, {"type": "precision_at_10", "value": 22.259999999999998}, {"type": "precision_at_100", "value": 6.712}, {"type": "precision_at_1000", "value": 1.9060000000000001}, {"type": "precision_at_3", "value": 33.23}, {"type": "precision_at_5", "value": 29.04}, {"type": "recall_at_1", "value": 4.919}, {"type": "recall_at_10", "value": 14.196}, {"type": "recall_at_100", "value": 26.948}, {"type": "recall_at_1000", "value": 59.211000000000006}, {"type": "recall_at_3", "value": 9.44}, {"type": "recall_at_5", "value": 11.569}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB NQ-PL", "type": "clarin-knext/nq-pl", "config": "default", "split": "test", "revision": "f171245712cf85dd4700b06bef18001578d0ca8d"}, "metrics": [{"type": "map_at_1", "value": 25.35}, {"type": "map_at_10", "value": 37.884}, {"type": "map_at_100", "value": 38.955}, {"type": "map_at_1000", "value": 39.007999999999996}, {"type": "map_at_3", "value": 34.239999999999995}, {"type": "map_at_5", "value": 36.398}, {"type": "mrr_at_1", "value": 28.737000000000002}, {"type": "mrr_at_10", "value": 39.973}, {"type": "mrr_at_100", "value": 40.844}, 
{"type": "mrr_at_1000", "value": 40.885}, {"type": "mrr_at_3", "value": 36.901}, {"type": "mrr_at_5", "value": 38.721}, {"type": "ndcg_at_1", "value": 28.708}, {"type": "ndcg_at_10", "value": 44.204}, {"type": "ndcg_at_100", "value": 48.978}, {"type": "ndcg_at_1000", "value": 50.33}, {"type": "ndcg_at_3", "value": 37.36}, {"type": "ndcg_at_5", "value": 40.912}, {"type": "precision_at_1", "value": 28.708}, {"type": "precision_at_10", "value": 7.367}, {"type": "precision_at_100", "value": 1.0030000000000001}, {"type": "precision_at_1000", "value": 0.11299999999999999}, {"type": "precision_at_3", "value": 17.034}, {"type": "precision_at_5", "value": 12.293999999999999}, {"type": "recall_at_1", "value": 25.35}, {"type": "recall_at_10", "value": 61.411}, {"type": "recall_at_100", "value": 82.599}, {"type": "recall_at_1000", "value": 92.903}, {"type": "recall_at_3", "value": 43.728}, {"type": "recall_at_5", "value": 51.854}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB PAC", "type": "laugustyniak/abusive-clauses-pl", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "accuracy", "value": 69.04141326382856}, {"type": "ap", "value": 77.49422763833996}, {"type": "f1", "value": 66.73472657783407}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB PPC", "type": "PL-MTEB/ppc-pairclassification", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "cos_sim_accuracy", "value": 81.0}, {"type": "cos_sim_ap", "value": 91.47194213011349}, {"type": "cos_sim_f1", "value": 84.73767885532592}, {"type": "cos_sim_precision", "value": 81.49847094801224}, {"type": "cos_sim_recall", "value": 88.24503311258279}, {"type": "dot_accuracy", "value": 81.0}, {"type": "dot_ap", "value": 91.47194213011349}, {"type": "dot_f1", "value": 84.73767885532592}, {"type": "dot_precision", "value": 81.49847094801224}, {"type": "dot_recall", "value": 88.24503311258279}, {"type": "euclidean_accuracy", "value": 81.0}, {"type": "euclidean_ap", "value": 91.47194213011349}, {"type": "euclidean_f1", "value": 84.73767885532592}, {"type": "euclidean_precision", "value": 81.49847094801224}, {"type": "euclidean_recall", "value": 88.24503311258279}, {"type": "manhattan_accuracy", "value": 81.0}, {"type": "manhattan_ap", "value": 91.46464475050571}, {"type": "manhattan_f1", "value": 84.48687350835321}, {"type": "manhattan_precision", "value": 81.31699846860643}, {"type": "manhattan_recall", "value": 87.91390728476821}, {"type": "max_accuracy", "value": 81.0}, {"type": "max_ap", "value": 91.47194213011349}, {"type": "max_f1", "value": 84.73767885532592}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB PSC", "type": "PL-MTEB/psc-pairclassification", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "cos_sim_accuracy", "value": 97.6808905380334}, {"type": "cos_sim_ap", "value": 99.27948611836348}, {"type": "cos_sim_f1", "value": 96.15975422427034}, {"type": "cos_sim_precision", "value": 96.90402476780186}, {"type": "cos_sim_recall", "value": 95.42682926829268}, {"type": "dot_accuracy", "value": 97.6808905380334}, {"type": "dot_ap", "value": 99.2794861183635}, {"type": "dot_f1", "value": 96.15975422427034}, {"type": "dot_precision", "value": 96.90402476780186}, {"type": "dot_recall", "value": 95.42682926829268}, {"type": "euclidean_accuracy", "value": 97.6808905380334}, {"type": "euclidean_ap", "value": 99.2794861183635}, {"type": "euclidean_f1", "value": 96.15975422427034}, {"type": "euclidean_precision", 
"value": 96.90402476780186}, {"type": "euclidean_recall", "value": 95.42682926829268}, {"type": "manhattan_accuracy", "value": 97.6808905380334}, {"type": "manhattan_ap", "value": 99.28715055268721}, {"type": "manhattan_f1", "value": 96.14791987673343}, {"type": "manhattan_precision", "value": 97.19626168224299}, {"type": "manhattan_recall", "value": 95.1219512195122}, {"type": "max_accuracy", "value": 97.6808905380334}, {"type": "max_ap", "value": 99.28715055268721}, {"type": "max_f1", "value": 96.15975422427034}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB PolEmo2.0-IN", "type": "PL-MTEB/polemo2_in", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "accuracy", "value": 86.16343490304708}, {"type": "f1", "value": 83.3442579486744}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB PolEmo2.0-OUT", "type": "PL-MTEB/polemo2_out", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "accuracy", "value": 68.40080971659918}, {"type": "f1", "value": 53.13720751142237}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB Quora-PL", "type": "clarin-knext/quora-pl", "config": "default", "split": "test", "revision": "0be27e93455051e531182b85e85e425aba12e9d4"}, "metrics": [{"type": "map_at_1", "value": 63.322}, {"type": "map_at_10", "value": 76.847}, {"type": "map_at_100", "value": 77.616}, {"type": "map_at_1000", "value": 77.644}, {"type": "map_at_3", "value": 73.624}, {"type": "map_at_5", "value": 75.603}, {"type": "mrr_at_1", "value": 72.88}, {"type": "mrr_at_10", "value": 80.376}, {"type": "mrr_at_100", "value": 80.604}, {"type": "mrr_at_1000", "value": 80.61}, {"type": "mrr_at_3", "value": 78.92}, {"type": "mrr_at_5", "value": 79.869}, {"type": "ndcg_at_1", "value": 72.89999999999999}, {"type": "ndcg_at_10", "value": 81.43}, {"type": "ndcg_at_100", "value": 83.394}, {"type": "ndcg_at_1000", "value": 83.685}, {"type": "ndcg_at_3", "value": 77.62599999999999}, {"type": "ndcg_at_5", "value": 79.656}, {"type": "precision_at_1", "value": 72.89999999999999}, {"type": "precision_at_10", "value": 12.548}, {"type": "precision_at_100", "value": 1.4869999999999999}, {"type": "precision_at_1000", "value": 0.155}, {"type": "precision_at_3", "value": 34.027}, {"type": "precision_at_5", "value": 22.654}, {"type": "recall_at_1", "value": 63.322}, {"type": "recall_at_10", "value": 90.664}, {"type": "recall_at_100", "value": 97.974}, {"type": "recall_at_1000", "value": 99.636}, {"type": "recall_at_3", "value": 80.067}, {"type": "recall_at_5", "value": 85.526}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB SCIDOCS-PL", "type": "clarin-knext/scidocs-pl", "config": "default", "split": "test", "revision": "45452b03f05560207ef19149545f168e596c9337"}, "metrics": [{"type": "map_at_1", "value": 3.95}, {"type": "map_at_10", "value": 9.658999999999999}, {"type": "map_at_100", "value": 11.384}, {"type": "map_at_1000", "value": 11.677}, {"type": "map_at_3", "value": 7.055}, {"type": "map_at_5", "value": 8.244}, {"type": "mrr_at_1", "value": 19.5}, {"type": "mrr_at_10", "value": 28.777}, {"type": "mrr_at_100", "value": 29.936}, {"type": "mrr_at_1000", "value": 30.009999999999998}, {"type": "mrr_at_3", "value": 25.55}, {"type": "mrr_at_5", "value": 27.284999999999997}, {"type": "ndcg_at_1", "value": 19.5}, {"type": "ndcg_at_10", "value": 16.589000000000002}, {"type": "ndcg_at_100", "value": 23.879}, {"type": "ndcg_at_1000", "value": 29.279}, {"type": "ndcg_at_3", "value": 15.719}, {"type": "ndcg_at_5", "value": 
13.572000000000001}, {"type": "precision_at_1", "value": 19.5}, {"type": "precision_at_10", "value": 8.62}, {"type": "precision_at_100", "value": 1.924}, {"type": "precision_at_1000", "value": 0.322}, {"type": "precision_at_3", "value": 14.6}, {"type": "precision_at_5", "value": 11.78}, {"type": "recall_at_1", "value": 3.95}, {"type": "recall_at_10", "value": 17.477999999999998}, {"type": "recall_at_100", "value": 38.99}, {"type": "recall_at_1000", "value": 65.417}, {"type": "recall_at_3", "value": 8.883000000000001}, {"type": "recall_at_5", "value": 11.933}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB SICK-E-PL", "type": "PL-MTEB/sicke-pl-pairclassification", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "cos_sim_accuracy", "value": 83.48960456583775}, {"type": "cos_sim_ap", "value": 76.31522115825375}, {"type": "cos_sim_f1", "value": 70.35573122529645}, {"type": "cos_sim_precision", "value": 70.9934735315446}, {"type": "cos_sim_recall", "value": 69.72934472934473}, {"type": "dot_accuracy", "value": 83.48960456583775}, {"type": "dot_ap", "value": 76.31522115825373}, {"type": "dot_f1", "value": 70.35573122529645}, {"type": "dot_precision", "value": 70.9934735315446}, {"type": "dot_recall", "value": 69.72934472934473}, {"type": "euclidean_accuracy", "value": 83.48960456583775}, {"type": "euclidean_ap", "value": 76.31522115825373}, {"type": "euclidean_f1", "value": 70.35573122529645}, {"type": "euclidean_precision", "value": 70.9934735315446}, {"type": "euclidean_recall", "value": 69.72934472934473}, {"type": "manhattan_accuracy", "value": 83.46922136159804}, {"type": "manhattan_ap", "value": 76.18474601388084}, {"type": "manhattan_f1", "value": 70.34779490856937}, {"type": "manhattan_precision", "value": 70.83032490974729}, {"type": "manhattan_recall", "value": 69.87179487179486}, {"type": "max_accuracy", "value": 83.48960456583775}, {"type": "max_ap", "value": 76.31522115825375}, {"type": "max_f1", "value": 70.35573122529645}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB SICK-R-PL", "type": "PL-MTEB/sickr-pl-sts", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "cos_sim_pearson", "value": 77.95374883876302}, {"type": "cos_sim_spearman", "value": 73.77630219171942}, {"type": "euclidean_pearson", "value": 75.81927069594934}, {"type": "euclidean_spearman", "value": 73.7763211303831}, {"type": "manhattan_pearson", "value": 76.03126859057528}, {"type": "manhattan_spearman", "value": 73.96528138013369}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS22 (pl)", "type": "mteb/sts22-crosslingual-sts", "config": "pl", "split": "test", "revision": "eea2b4fe26a775864c896887d910b76a8098ad3f"}, "metrics": [{"type": "cos_sim_pearson", "value": 37.388282764841826}, {"type": "cos_sim_spearman", "value": 40.83477184710897}, {"type": "euclidean_pearson", "value": 26.754737044177805}, {"type": "euclidean_spearman", "value": 40.83477184710897}, {"type": "manhattan_pearson", "value": 26.760453110872458}, {"type": "manhattan_spearman", "value": 41.034477441383856}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB SciFact-PL", "type": "clarin-knext/scifact-pl", "config": "default", "split": "test", "revision": "47932a35f045ef8ed01ba82bf9ff67f6e109207e"}, "metrics": [{"type": "map_at_1", "value": 49.15}, {"type": "map_at_10", "value": 61.690999999999995}, {"type": "map_at_100", "value": 62.348000000000006}, {"type": "map_at_1000", "value": 62.38}, {"type": "map_at_3", "value": 58.824}, {"type": 
"map_at_5", "value": 60.662000000000006}, {"type": "mrr_at_1", "value": 51.333}, {"type": "mrr_at_10", "value": 62.731}, {"type": "mrr_at_100", "value": 63.245}, {"type": "mrr_at_1000", "value": 63.275000000000006}, {"type": "mrr_at_3", "value": 60.667}, {"type": "mrr_at_5", "value": 61.93300000000001}, {"type": "ndcg_at_1", "value": 51.333}, {"type": "ndcg_at_10", "value": 67.168}, {"type": "ndcg_at_100", "value": 69.833}, {"type": "ndcg_at_1000", "value": 70.56700000000001}, {"type": "ndcg_at_3", "value": 62.40599999999999}, {"type": "ndcg_at_5", "value": 65.029}, {"type": "precision_at_1", "value": 51.333}, {"type": "precision_at_10", "value": 9.333}, {"type": "precision_at_100", "value": 1.0699999999999998}, {"type": "precision_at_1000", "value": 0.11299999999999999}, {"type": "precision_at_3", "value": 25.333}, {"type": "precision_at_5", "value": 17.067}, {"type": "recall_at_1", "value": 49.15}, {"type": "recall_at_10", "value": 82.533}, {"type": "recall_at_100", "value": 94.167}, {"type": "recall_at_1000", "value": 99.667}, {"type": "recall_at_3", "value": 69.917}, {"type": "recall_at_5", "value": 76.356}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB TRECCOVID-PL", "type": "clarin-knext/trec-covid-pl", "config": "default", "split": "test", "revision": "81bcb408f33366c2a20ac54adafad1ae7e877fdd"}, "metrics": [{"type": "map_at_1", "value": 0.261}, {"type": "map_at_10", "value": 2.1260000000000003}, {"type": "map_at_100", "value": 12.171999999999999}, {"type": "map_at_1000", "value": 26.884999999999998}, {"type": "map_at_3", "value": 0.695}, {"type": "map_at_5", "value": 1.134}, {"type": "mrr_at_1", "value": 96.0}, {"type": "mrr_at_10", "value": 96.952}, {"type": "mrr_at_100", "value": 96.952}, {"type": "mrr_at_1000", "value": 96.952}, {"type": "mrr_at_3", "value": 96.667}, {"type": "mrr_at_5", "value": 96.667}, {"type": "ndcg_at_1", "value": 92.0}, {"type": "ndcg_at_10", "value": 81.193}, {"type": "ndcg_at_100", "value": 61.129}, {"type": "ndcg_at_1000", "value": 51.157}, {"type": "ndcg_at_3", "value": 85.693}, {"type": "ndcg_at_5", "value": 84.129}, {"type": "precision_at_1", "value": 96.0}, {"type": "precision_at_10", "value": 85.39999999999999}, {"type": "precision_at_100", "value": 62.03999999999999}, {"type": "precision_at_1000", "value": 22.224}, {"type": "precision_at_3", "value": 88.0}, {"type": "precision_at_5", "value": 88.0}, {"type": "recall_at_1", "value": 0.261}, {"type": "recall_at_10", "value": 2.262}, {"type": "recall_at_100", "value": 14.981}, {"type": "recall_at_1000", "value": 46.837}, {"type": "recall_at_3", "value": 0.703}, {"type": "recall_at_5", "value": 1.172}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB AlloProfClusteringP2P", "type": "lyon-nlp/alloprof", "config": "default", "split": "test", "revision": "392ba3f5bcc8c51f578786c1fc3dae648662cb9b"}, "metrics": [{"type": "v_measure", "value": 70.55290063940157}, {"type": "v_measure", "value": 55.41500719337263}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB AlloprofReranking", "type": "lyon-nlp/mteb-fr-reranking-alloprof-s2p", "config": "default", "split": "test", "revision": "666fdacebe0291776e86f29345663dfaf80a0db9"}, "metrics": [{"type": "map", "value": 73.48697375332002}, {"type": "mrr", "value": 75.01836585523822}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB AlloprofRetrieval", "type": "lyon-nlp/alloprof", "config": "default", "split": "test", "revision": "392ba3f5bcc8c51f578786c1fc3dae648662cb9b"}, "metrics": [{"type": "map_at_1", "value": 
38.454}, {"type": "map_at_10", "value": 51.605000000000004}, {"type": "map_at_100", "value": 52.653000000000006}, {"type": "map_at_1000", "value": 52.697}, {"type": "map_at_3", "value": 48.304}, {"type": "map_at_5", "value": 50.073}, {"type": "mrr_at_1", "value": 43.307}, {"type": "mrr_at_10", "value": 54.400000000000006}, {"type": "mrr_at_100", "value": 55.147999999999996}, {"type": "mrr_at_1000", "value": 55.174}, {"type": "mrr_at_3", "value": 51.77}, {"type": "mrr_at_5", "value": 53.166999999999994}, {"type": "ndcg_at_1", "value": 43.307}, {"type": "ndcg_at_10", "value": 57.891000000000005}, {"type": "ndcg_at_100", "value": 62.161}, {"type": "ndcg_at_1000", "value": 63.083}, {"type": "ndcg_at_3", "value": 51.851}, {"type": "ndcg_at_5", "value": 54.605000000000004}, {"type": "precision_at_1", "value": 43.307}, {"type": "precision_at_10", "value": 9.033}, {"type": "precision_at_100", "value": 1.172}, {"type": "precision_at_1000", "value": 0.127}, {"type": "precision_at_3", "value": 22.798}, {"type": "precision_at_5", "value": 15.492}, {"type": "recall_at_1", "value": 38.454}, {"type": "recall_at_10", "value": 74.166}, {"type": "recall_at_100", "value": 92.43599999999999}, {"type": "recall_at_1000", "value": 99.071}, {"type": "recall_at_3", "value": 58.087}, {"type": "recall_at_5", "value": 64.568}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB AmazonReviewsClassification (fr)", "type": "mteb/amazon_reviews_multi", "config": "fr", "split": "test", "revision": "1399c76144fd37290681b995c656ef9b2e06e26d"}, "metrics": [{"type": "accuracy", "value": 53.474}, {"type": "f1", "value": 50.38275392350236}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB BSARDRetrieval", "type": "maastrichtlawtech/bsard", "config": "default", "split": "test", "revision": "5effa1b9b5fa3b0f9e12523e6e43e5f86a6e6d59"}, "metrics": [{"type": "map_at_1", "value": 2.252}, {"type": "map_at_10", "value": 4.661}, {"type": "map_at_100", "value": 5.271}, {"type": "map_at_1000", "value": 5.3629999999999995}, {"type": "map_at_3", "value": 3.604}, {"type": "map_at_5", "value": 4.3020000000000005}, {"type": "mrr_at_1", "value": 2.252}, {"type": "mrr_at_10", "value": 4.661}, {"type": "mrr_at_100", "value": 5.271}, {"type": "mrr_at_1000", "value": 5.3629999999999995}, {"type": "mrr_at_3", "value": 3.604}, {"type": "mrr_at_5", "value": 4.3020000000000005}, {"type": "ndcg_at_1", "value": 2.252}, {"type": "ndcg_at_10", "value": 6.3020000000000005}, {"type": "ndcg_at_100", "value": 10.342}, {"type": "ndcg_at_1000", "value": 13.475999999999999}, {"type": "ndcg_at_3", "value": 4.0649999999999995}, {"type": "ndcg_at_5", "value": 5.344}, {"type": "precision_at_1", "value": 2.252}, {"type": "precision_at_10", "value": 1.171}, {"type": "precision_at_100", "value": 0.333}, {"type": "precision_at_1000", "value": 0.059000000000000004}, {"type": "precision_at_3", "value": 1.802}, {"type": "precision_at_5", "value": 1.712}, {"type": "recall_at_1", "value": 2.252}, {"type": "recall_at_10", "value": 11.712}, {"type": "recall_at_100", "value": 33.333}, {"type": "recall_at_1000", "value": 59.458999999999996}, {"type": "recall_at_3", "value": 5.405}, {"type": "recall_at_5", "value": 8.559}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB HALClusteringS2S", "type": "lyon-nlp/clustering-hal-s2s", "config": "default", "split": "test", "revision": "e06ebbbb123f8144bef1a5d18796f3dec9ae2915"}, "metrics": [{"type": "v_measure", "value": 28.301882091023288}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB 
MLSUMClusteringP2P", "type": "mlsum", "config": "default", "split": "test", "revision": "b5d54f8f3b61ae17845046286940f03c6bc79bc7"}, "metrics": [{"type": "v_measure", "value": 45.26992995191701}, {"type": "v_measure", "value": 42.773174876871145}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MTOPDomainClassification (fr)", "type": "mteb/mtop_domain", "config": "fr", "split": "test", "revision": "d80d48c1eb48d3562165c59d59d0034df9fff0bf"}, "metrics": [{"type": "accuracy", "value": 93.47635452552458}, {"type": "f1", "value": 93.19922617577213}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MTOPIntentClassification (fr)", "type": "mteb/mtop_intent", "config": "fr", "split": "test", "revision": "ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba"}, "metrics": [{"type": "accuracy", "value": 80.2317569683683}, {"type": "f1", "value": 56.18060418621901}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MasakhaNEWSClassification (fra)", "type": "masakhane/masakhanews", "config": "fra", "split": "test", "revision": "8ccc72e69e65f40c70e117d8b3c08306bb788b60"}, "metrics": [{"type": "accuracy", "value": 85.18957345971565}, {"type": "f1", "value": 80.829981537394}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB MasakhaNEWSClusteringP2P (fra)", "type": "masakhane/masakhanews", "config": "fra", "split": "test", "revision": "8ccc72e69e65f40c70e117d8b3c08306bb788b60"}, "metrics": [{"type": "v_measure", "value": 71.04138999801822}, {"type": "v_measure", "value": 71.7056263158008}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MassiveIntentClassification (fr)", "type": "mteb/amazon_massive_intent", "config": "fr", "split": "test", "revision": "31efe3c427b0bae9c22cbb560b8f15491cc6bed7"}, "metrics": [{"type": "accuracy", "value": 76.65097511768661}, {"type": "f1", "value": 73.82441070598712}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MassiveScenarioClassification (fr)", "type": "mteb/amazon_massive_scenario", "config": "fr", "split": "test", "revision": "7d571f92784cd94a019292a1f45445077d0ef634"}, "metrics": [{"type": "accuracy", "value": 79.09885675857431}, {"type": "f1", "value": 78.28407777434224}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB MintakaRetrieval (fr)", "type": "jinaai/mintakaqa", "config": "fr", "split": "test", "revision": "efa78cc2f74bbcd21eff2261f9e13aebe40b814e"}, "metrics": [{"type": "map_at_1", "value": 25.307000000000002}, {"type": "map_at_10", "value": 36.723}, {"type": "map_at_100", "value": 37.713}, {"type": "map_at_1000", "value": 37.769000000000005}, {"type": "map_at_3", "value": 33.77}, {"type": "map_at_5", "value": 35.463}, {"type": "mrr_at_1", "value": 25.307000000000002}, {"type": "mrr_at_10", "value": 36.723}, {"type": "mrr_at_100", "value": 37.713}, {"type": "mrr_at_1000", "value": 37.769000000000005}, {"type": "mrr_at_3", "value": 33.77}, {"type": "mrr_at_5", "value": 35.463}, {"type": "ndcg_at_1", "value": 25.307000000000002}, {"type": "ndcg_at_10", "value": 42.559999999999995}, {"type": "ndcg_at_100", "value": 47.457}, {"type": "ndcg_at_1000", "value": 49.162}, {"type": "ndcg_at_3", "value": 36.461}, {"type": "ndcg_at_5", "value": 39.504}, {"type": "precision_at_1", "value": 25.307000000000002}, {"type": "precision_at_10", "value": 6.106}, {"type": "precision_at_100", "value": 0.8420000000000001}, {"type": "precision_at_1000", "value": 0.098}, {"type": "precision_at_3", "value": 14.741999999999999}, {"type": "precision_at_5", "value": 10.319}, {"type": 
"recall_at_1", "value": 25.307000000000002}, {"type": "recall_at_10", "value": 61.056999999999995}, {"type": "recall_at_100", "value": 84.152}, {"type": "recall_at_1000", "value": 98.03399999999999}, {"type": "recall_at_3", "value": 44.226}, {"type": "recall_at_5", "value": 51.597}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB OpusparcusPC (fr)", "type": "GEM/opusparcus", "config": "fr", "split": "test", "revision": "9e9b1f8ef51616073f47f306f7f47dd91663f86a"}, "metrics": [{"type": "cos_sim_accuracy", "value": 99.90069513406156}, {"type": "cos_sim_ap", "value": 100.0}, {"type": "cos_sim_f1", "value": 99.95032290114257}, {"type": "cos_sim_precision", "value": 100.0}, {"type": "cos_sim_recall", "value": 99.90069513406156}, {"type": "dot_accuracy", "value": 99.90069513406156}, {"type": "dot_ap", "value": 100.0}, {"type": "dot_f1", "value": 99.95032290114257}, {"type": "dot_precision", "value": 100.0}, {"type": "dot_recall", "value": 99.90069513406156}, {"type": "euclidean_accuracy", "value": 99.90069513406156}, {"type": "euclidean_ap", "value": 100.0}, {"type": "euclidean_f1", "value": 99.95032290114257}, {"type": "euclidean_precision", "value": 100.0}, {"type": "euclidean_recall", "value": 99.90069513406156}, {"type": "manhattan_accuracy", "value": 99.90069513406156}, {"type": "manhattan_ap", "value": 100.0}, {"type": "manhattan_f1", "value": 99.95032290114257}, {"type": "manhattan_precision", "value": 100.0}, {"type": "manhattan_recall", "value": 99.90069513406156}, {"type": "max_accuracy", "value": 99.90069513406156}, {"type": "max_ap", "value": 100.0}, {"type": "max_f1", "value": 99.95032290114257}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB PawsX (fr)", "type": "paws-x", "config": "fr", "split": "test", "revision": "8a04d940a42cd40658986fdd8e3da561533a3646"}, "metrics": [{"type": "cos_sim_accuracy", "value": 70.8}, {"type": "cos_sim_ap", "value": 73.7671529695957}, {"type": "cos_sim_f1", "value": 68.80964339527875}, {"type": "cos_sim_precision", "value": 62.95955882352941}, {"type": "cos_sim_recall", "value": 75.85825027685493}, {"type": "dot_accuracy", "value": 70.8}, {"type": "dot_ap", "value": 73.78345265366947}, {"type": "dot_f1", "value": 68.80964339527875}, {"type": "dot_precision", "value": 62.95955882352941}, {"type": "dot_recall", "value": 75.85825027685493}, {"type": "euclidean_accuracy", "value": 70.8}, {"type": "euclidean_ap", "value": 73.7671529695957}, {"type": "euclidean_f1", "value": 68.80964339527875}, {"type": "euclidean_precision", "value": 62.95955882352941}, {"type": "euclidean_recall", "value": 75.85825027685493}, {"type": "manhattan_accuracy", "value": 70.75}, {"type": "manhattan_ap", "value": 73.78996383615953}, {"type": "manhattan_f1", "value": 68.79432624113475}, {"type": "manhattan_precision", "value": 63.39869281045751}, {"type": "manhattan_recall", "value": 75.1937984496124}, {"type": "max_accuracy", "value": 70.8}, {"type": "max_ap", "value": 73.78996383615953}, {"type": "max_f1", "value": 68.80964339527875}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB SICKFr", "type": "Lajavaness/SICK-fr", "config": "default", "split": "test", "revision": "e077ab4cf4774a1e36d86d593b150422fafd8e8a"}, "metrics": [{"type": "cos_sim_pearson", "value": 84.03253762760392}, {"type": "cos_sim_spearman", "value": 79.68280105762004}, {"type": "euclidean_pearson", "value": 80.98265050044444}, {"type": "euclidean_spearman", "value": 79.68233242682867}, {"type": "manhattan_pearson", "value": 80.9678911810704}, {"type": 
"manhattan_spearman", "value": 79.70264097683109}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS22 (fr)", "type": "mteb/sts22-crosslingual-sts", "config": "fr", "split": "test", "revision": "eea2b4fe26a775864c896887d910b76a8098ad3f"}, "metrics": [{"type": "cos_sim_pearson", "value": 80.56896987572884}, {"type": "cos_sim_spearman", "value": 81.84352499523287}, {"type": "euclidean_pearson", "value": 80.40831759421305}, {"type": "euclidean_spearman", "value": 81.84352499523287}, {"type": "manhattan_pearson", "value": 80.74333857561238}, {"type": "manhattan_spearman", "value": 82.41503246733892}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STSBenchmarkMultilingualSTS (fr)", "type": "stsb_multi_mt", "config": "fr", "split": "test", "revision": "93d57ef91790589e3ce9c365164337a8a78b7632"}, "metrics": [{"type": "cos_sim_pearson", "value": 82.71826762276979}, {"type": "cos_sim_spearman", "value": 82.25433354916042}, {"type": "euclidean_pearson", "value": 81.87115571724316}, {"type": "euclidean_spearman", "value": 82.25322342890107}, {"type": "manhattan_pearson", "value": 82.11174867527224}, {"type": "manhattan_spearman", "value": 82.55905365203084}]}, {"task": {"type": "Summarization"}, "dataset": {"name": "MTEB SummEvalFr", "type": "lyon-nlp/summarization-summeval-fr-p2p", "config": "default", "split": "test", "revision": "b385812de6a9577b6f4d0f88c6a6e35395a94054"}, "metrics": [{"type": "cos_sim_pearson", "value": 30.659441623392887}, {"type": "cos_sim_spearman", "value": 30.501134097353315}, {"type": "dot_pearson", "value": 30.659444768851056}, {"type": "dot_spearman", "value": 30.501134097353315}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB SyntecReranking", "type": "lyon-nlp/mteb-fr-reranking-syntec-s2p", "config": "default", "split": "test", "revision": "b205c5084a0934ce8af14338bf03feb19499c84d"}, "metrics": [{"type": "map", "value": 94.03333333333333}, {"type": "mrr", "value": 94.03333333333333}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB SyntecRetrieval", "type": "lyon-nlp/mteb-fr-retrieval-syntec-s2p", "config": "default", "split": "test", "revision": "77f7e271bf4a92b24fce5119f3486b583ca016ff"}, "metrics": [{"type": "map_at_1", "value": 79.0}, {"type": "map_at_10", "value": 87.61}, {"type": "map_at_100", "value": 87.655}, {"type": "map_at_1000", "value": 87.655}, {"type": "map_at_3", "value": 87.167}, {"type": "map_at_5", "value": 87.36699999999999}, {"type": "mrr_at_1", "value": 79.0}, {"type": "mrr_at_10", "value": 87.61}, {"type": "mrr_at_100", "value": 87.655}, {"type": "mrr_at_1000", "value": 87.655}, {"type": "mrr_at_3", "value": 87.167}, {"type": "mrr_at_5", "value": 87.36699999999999}, {"type": "ndcg_at_1", "value": 79.0}, {"type": "ndcg_at_10", "value": 90.473}, {"type": "ndcg_at_100", "value": 90.694}, {"type": "ndcg_at_1000", "value": 90.694}, {"type": "ndcg_at_3", "value": 89.464}, {"type": "ndcg_at_5", "value": 89.851}, {"type": "precision_at_1", "value": 79.0}, {"type": "precision_at_10", "value": 9.9}, {"type": "precision_at_100", "value": 1.0}, {"type": "precision_at_1000", "value": 0.1}, {"type": "precision_at_3", "value": 32.0}, {"type": "precision_at_5", "value": 19.400000000000002}, {"type": "recall_at_1", "value": 79.0}, {"type": "recall_at_10", "value": 99.0}, {"type": "recall_at_100", "value": 100.0}, {"type": "recall_at_1000", "value": 100.0}, {"type": "recall_at_3", "value": 96.0}, {"type": "recall_at_5", "value": 97.0}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB XPQARetrieval (fr)", "type": 
"jinaai/xpqa", "config": "fr", "split": "test", "revision": "c99d599f0a6ab9b85b065da6f9d94f9cf731679f"}, "metrics": [{"type": "map_at_1", "value": 39.395}, {"type": "map_at_10", "value": 59.123999999999995}, {"type": "map_at_100", "value": 60.704}, {"type": "map_at_1000", "value": 60.760000000000005}, {"type": "map_at_3", "value": 53.187}, {"type": "map_at_5", "value": 56.863}, {"type": "mrr_at_1", "value": 62.083}, {"type": "mrr_at_10", "value": 68.87299999999999}, {"type": "mrr_at_100", "value": 69.46900000000001}, {"type": "mrr_at_1000", "value": 69.48299999999999}, {"type": "mrr_at_3", "value": 66.8}, {"type": "mrr_at_5", "value": 67.928}, {"type": "ndcg_at_1", "value": 62.083}, {"type": "ndcg_at_10", "value": 65.583}, {"type": "ndcg_at_100", "value": 70.918}, {"type": "ndcg_at_1000", "value": 71.72800000000001}, {"type": "ndcg_at_3", "value": 60.428000000000004}, {"type": "ndcg_at_5", "value": 61.853}, {"type": "precision_at_1", "value": 62.083}, {"type": "precision_at_10", "value": 15.033}, {"type": "precision_at_100", "value": 1.9529999999999998}, {"type": "precision_at_1000", "value": 0.207}, {"type": "precision_at_3", "value": 36.315}, {"type": "precision_at_5", "value": 25.955000000000002}, {"type": "recall_at_1", "value": 39.395}, {"type": "recall_at_10", "value": 74.332}, {"type": "recall_at_100", "value": 94.729}, {"type": "recall_at_1000", "value": 99.75500000000001}, {"type": "recall_at_3", "value": 57.679}, {"type": "recall_at_5", "value": 65.036}]}]}]} |
model-attribution-challenge/bloom-2b5 | model-attribution-challenge | text-generation | [
"transformers",
"pytorch",
"bloom",
"feature-extraction",
"text-generation",
"ak",
"ar",
"as",
"bm",
"bn",
"ca",
"code",
"en",
"es",
"eu",
"fon",
"fr",
"gu",
"hi",
"id",
"ig",
"ki",
"kn",
"lg",
"ln",
"ml",
"mr",
"ne",
"nso",
"ny",
"or",
"pa",
"pt",
"rn",
"rw",
"sn",
"st",
"sw",
"ta",
"te",
"tn",
"ts",
"tum",
"tw",
"ur",
"vi",
"wo",
"xh",
"yo",
"zh",
"zhs",
"zht",
"zu",
"arxiv:1909.08053",
"arxiv:2110.02861",
"arxiv:2108.12409",
"license:bigscience-bloom-rail-1.0",
"model-index",
"text-generation-inference",
"endpoints_compatible",
"region:us"
]
| 2022-08-09T19:38:50 | 2022-09-27T15:58:41 | 31 | 0 | ---
language:
- ak
- ar
- as
- bm
- bn
- ca
- code
- en
- es
- eu
- fon
- fr
- gu
- hi
- id
- ig
- ki
- kn
- lg
- ln
- ml
- mr
- ne
- nso
- ny
- or
- pa
- pt
- rn
- rw
- sn
- st
- sw
- ta
- te
- tn
- ts
- tum
- tw
- ur
- vi
- wo
- xh
- yo
- zh
- zhs
- zht
- zu
license: bigscience-bloom-rail-1.0
pipeline_tag: text-generation
model-index:
- name: bloom
results:
- task:
type: text-generation
name: text generation
dataset:
name: arc_challenge
type: arc_challenge
metrics:
- type: acc
value: 0.27986348122866894
name: acc
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: arc_easy
type: arc_easy
metrics:
- type: acc
value: 0.5946969696969697
name: acc
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: axb
type: axb
metrics:
- type: acc
value: 0.4433876811594203
name: acc
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: axg
type: axg
metrics:
- type: acc
value: 0.5
name: acc
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: boolq
type: boolq
metrics:
- type: acc
value: 0.6165137614678899
name: acc
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: cb
type: cb
metrics:
- type: acc
value: 0.30357142857142855
name: acc
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: cola
type: cola
metrics:
- type: acc
value: 0.610738255033557
name: acc
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: copa
type: copa
metrics:
- type: acc
value: 0.63
name: acc
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: crows_pairs_english
type: crows_pairs_english
metrics:
- type: acc
value: 0.4973166368515206
name: acc
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: crows_pairs_french
type: crows_pairs_french
metrics:
- type: acc
value: 0.5032796660703638
name: acc
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: diabla
type: diabla
metrics:
- type: acc
value: 0.28888308977035493
name: acc
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_afr
type: gsarti/flores_101_afr
metrics:
- type: byte_perplexity
value: 6.500798737976343
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_amh
type: gsarti/flores_101_amh
metrics:
- type: byte_perplexity
value: 3.9726863338897145
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_ara
type: gsarti/flores_101_ara
metrics:
- type: byte_perplexity
value: 1.8083841089875814
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_asm
type: gsarti/flores_101_asm
metrics:
- type: byte_perplexity
value: 5.699102962086425
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_ast
type: gsarti/flores_101_ast
metrics:
- type: byte_perplexity
value: 3.9252047073429384
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_azj
type: gsarti/flores_101_azj
metrics:
- type: byte_perplexity
value: 6.942805054270002
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_bel
type: gsarti/flores_101_bel
metrics:
- type: byte_perplexity
value: 3.614136245847082
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_ben
type: gsarti/flores_101_ben
metrics:
- type: byte_perplexity
value: 5.121491534300969
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_bos
type: gsarti/flores_101_bos
metrics:
- type: byte_perplexity
value: 5.653353469118798
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_bul
type: gsarti/flores_101_bul
metrics:
- type: byte_perplexity
value: 2.7014693938055068
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_cat
type: gsarti/flores_101_cat
metrics:
- type: byte_perplexity
value: 2.305190041967345
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_ceb
type: gsarti/flores_101_ceb
metrics:
- type: byte_perplexity
value: 6.291000321323428
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_ces
type: gsarti/flores_101_ces
metrics:
- type: byte_perplexity
value: 5.447322753586386
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_ckb
type: gsarti/flores_101_ckb
metrics:
- type: byte_perplexity
value: 3.7255124939234765
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_cym
type: gsarti/flores_101_cym
metrics:
- type: byte_perplexity
value: 12.539424151448149
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_dan
type: gsarti/flores_101_dan
metrics:
- type: byte_perplexity
value: 5.183309001005672
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_deu
type: gsarti/flores_101_deu
metrics:
- type: byte_perplexity
value: 3.1180422286591347
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_ell
type: gsarti/flores_101_ell
metrics:
- type: byte_perplexity
value: 2.467943456164706
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_eng
type: gsarti/flores_101_eng
metrics:
- type: byte_perplexity
value: 2.018740628193298
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_est
type: gsarti/flores_101_est
metrics:
- type: byte_perplexity
value: 9.11654425176368
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_fas
type: gsarti/flores_101_fas
metrics:
- type: byte_perplexity
value: 3.058009097116482
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_fin
type: gsarti/flores_101_fin
metrics:
- type: byte_perplexity
value: 6.847047959628553
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_fra
type: gsarti/flores_101_fra
metrics:
- type: byte_perplexity
value: 1.9975177011840075
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_ful
type: gsarti/flores_101_ful
metrics:
- type: byte_perplexity
value: 11.465912731488828
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_gle
type: gsarti/flores_101_gle
metrics:
- type: byte_perplexity
value: 8.681491663539422
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_glg
type: gsarti/flores_101_glg
metrics:
- type: byte_perplexity
value: 3.029991089015508
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_guj
type: gsarti/flores_101_guj
metrics:
- type: byte_perplexity
value: 4.955224230286231
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_hau
type: gsarti/flores_101_hau
metrics:
- type: byte_perplexity
value: 10.758347356372159
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_heb
type: gsarti/flores_101_heb
metrics:
- type: byte_perplexity
value: 3.6004478129801667
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_hin
type: gsarti/flores_101_hin
metrics:
- type: byte_perplexity
value: 4.712530650588064
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_hrv
type: gsarti/flores_101_hrv
metrics:
- type: byte_perplexity
value: 5.822418943372185
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_hun
type: gsarti/flores_101_hun
metrics:
- type: byte_perplexity
value: 6.440482646965992
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_hye
type: gsarti/flores_101_hye
metrics:
- type: byte_perplexity
value: 3.657718918347166
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_ibo
type: gsarti/flores_101_ibo
metrics:
- type: byte_perplexity
value: 5.564814003872672
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_ind
type: gsarti/flores_101_ind
metrics:
- type: byte_perplexity
value: 2.1597101468869373
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_isl
type: gsarti/flores_101_isl
metrics:
- type: byte_perplexity
value: 8.082349269518136
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_ita
type: gsarti/flores_101_ita
metrics:
- type: byte_perplexity
value: 2.9687591414176207
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_jav
type: gsarti/flores_101_jav
metrics:
- type: byte_perplexity
value: 7.0573805415708994
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_jpn
type: gsarti/flores_101_jpn
metrics:
- type: byte_perplexity
value: 2.7758864197116933
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_kam
type: gsarti/flores_101_kam
metrics:
- type: byte_perplexity
value: 11.072949642861332
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_kan
type: gsarti/flores_101_kan
metrics:
- type: byte_perplexity
value: 5.551730651007082
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_kat
type: gsarti/flores_101_kat
metrics:
- type: byte_perplexity
value: 2.522630524283745
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_kaz
type: gsarti/flores_101_kaz
metrics:
- type: byte_perplexity
value: 3.3901748516975574
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_kea
type: gsarti/flores_101_kea
metrics:
- type: byte_perplexity
value: 8.918534182590863
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_kir
type: gsarti/flores_101_kir
metrics:
- type: byte_perplexity
value: 3.729278369847201
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_kor
type: gsarti/flores_101_kor
metrics:
- type: byte_perplexity
value: 3.932884847226212
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_lao
type: gsarti/flores_101_lao
metrics:
- type: byte_perplexity
value: 2.9077314760849924
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_lav
type: gsarti/flores_101_lav
metrics:
- type: byte_perplexity
value: 7.777221919194806
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_lin
type: gsarti/flores_101_lin
metrics:
- type: byte_perplexity
value: 7.524842908050988
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_lit
type: gsarti/flores_101_lit
metrics:
- type: byte_perplexity
value: 7.369179434621725
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_ltz
type: gsarti/flores_101_ltz
metrics:
- type: byte_perplexity
value: 8.801059747949214
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_lug
type: gsarti/flores_101_lug
metrics:
- type: byte_perplexity
value: 8.483203026364786
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_luo
type: gsarti/flores_101_luo
metrics:
- type: byte_perplexity
value: 11.975963093623681
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_mal
type: gsarti/flores_101_mal
metrics:
- type: byte_perplexity
value: 4.615948455160037
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_mar
type: gsarti/flores_101_mar
metrics:
- type: byte_perplexity
value: 5.483253482821379
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_mkd
type: gsarti/flores_101_mkd
metrics:
- type: byte_perplexity
value: 2.9656732291754087
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_mlt
type: gsarti/flores_101_mlt
metrics:
- type: byte_perplexity
value: 15.004773437665275
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_mon
type: gsarti/flores_101_mon
metrics:
- type: byte_perplexity
value: 3.410598542315402
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_mri
type: gsarti/flores_101_mri
metrics:
- type: byte_perplexity
value: 7.474035895661322
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_msa
type: gsarti/flores_101_msa
metrics:
- type: byte_perplexity
value: 2.5710001772665634
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_mya
type: gsarti/flores_101_mya
metrics:
- type: byte_perplexity
value: 2.413577969878331
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_nld
type: gsarti/flores_101_nld
metrics:
- type: byte_perplexity
value: 4.127831721885065
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_nob
type: gsarti/flores_101_nob
metrics:
- type: byte_perplexity
value: 5.402763169129877
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_npi
type: gsarti/flores_101_npi
metrics:
- type: byte_perplexity
value: 5.199342701937889
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_nso
type: gsarti/flores_101_nso
metrics:
- type: byte_perplexity
value: 8.154626800955667
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_nya
type: gsarti/flores_101_nya
metrics:
- type: byte_perplexity
value: 8.179860208369393
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_oci
type: gsarti/flores_101_oci
metrics:
- type: byte_perplexity
value: 4.8617357393685845
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_orm
type: gsarti/flores_101_orm
metrics:
- type: byte_perplexity
value: 12.911595421079408
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_ory
type: gsarti/flores_101_ory
metrics:
- type: byte_perplexity
value: 5.189421861225964
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_pan
type: gsarti/flores_101_pan
metrics:
- type: byte_perplexity
value: 4.698477289331806
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_pol
type: gsarti/flores_101_pol
metrics:
- type: byte_perplexity
value: 4.625550458479643
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_por
type: gsarti/flores_101_por
metrics:
- type: byte_perplexity
value: 1.9754515986213523
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_pus
type: gsarti/flores_101_pus
metrics:
- type: byte_perplexity
value: 4.4963371422771585
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_ron
type: gsarti/flores_101_ron
metrics:
- type: byte_perplexity
value: 4.965456830031304
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_rus
type: gsarti/flores_101_rus
metrics:
- type: byte_perplexity
value: 2.0498020542445303
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_slk
type: gsarti/flores_101_slk
metrics:
- type: byte_perplexity
value: 6.450822127057479
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_slv
type: gsarti/flores_101_slv
metrics:
- type: byte_perplexity
value: 6.620252120186232
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_sna
type: gsarti/flores_101_sna
metrics:
- type: byte_perplexity
value: 8.462166771382726
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_snd
type: gsarti/flores_101_snd
metrics:
- type: byte_perplexity
value: 5.466066951221973
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_som
type: gsarti/flores_101_som
metrics:
- type: byte_perplexity
value: 11.95918054093392
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_spa
type: gsarti/flores_101_spa
metrics:
- type: byte_perplexity
value: 1.8965140104323535
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_srp
type: gsarti/flores_101_srp
metrics:
- type: byte_perplexity
value: 2.871214785885079
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_swe
type: gsarti/flores_101_swe
metrics:
- type: byte_perplexity
value: 5.054972008155866
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_swh
type: gsarti/flores_101_swh
metrics:
- type: byte_perplexity
value: 3.6973091886730676
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_tam
type: gsarti/flores_101_tam
metrics:
- type: byte_perplexity
value: 4.539493400469833
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_tel
type: gsarti/flores_101_tel
metrics:
- type: byte_perplexity
value: 5.807499987508966
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_tgk
type: gsarti/flores_101_tgk
metrics:
- type: byte_perplexity
value: 3.5994818827380426
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_tgl
type: gsarti/flores_101_tgl
metrics:
- type: byte_perplexity
value: 5.667053833119858
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_tha
type: gsarti/flores_101_tha
metrics:
- type: byte_perplexity
value: 2.365940201944242
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_tur
type: gsarti/flores_101_tur
metrics:
- type: byte_perplexity
value: 4.885014749844601
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_ukr
type: gsarti/flores_101_ukr
metrics:
- type: byte_perplexity
value: 2.7240934990288483
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_umb
type: gsarti/flores_101_umb
metrics:
- type: byte_perplexity
value: 12.766915508610673
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_urd
type: gsarti/flores_101_urd
metrics:
- type: byte_perplexity
value: 1.9797467071381232
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_uzb
type: gsarti/flores_101_uzb
metrics:
- type: byte_perplexity
value: 12.002337637722146
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_vie
type: gsarti/flores_101_vie
metrics:
- type: byte_perplexity
value: 1.76578415476397
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_wol
type: gsarti/flores_101_wol
metrics:
- type: byte_perplexity
value: 9.144285650306488
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_xho
type: gsarti/flores_101_xho
metrics:
- type: byte_perplexity
value: 7.403240538286952
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_yor
type: gsarti/flores_101_yor
metrics:
- type: byte_perplexity
value: 5.91272037551173
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_zho_simpl
type: gsarti/flores_101_zho_simpl
metrics:
- type: byte_perplexity
value: 2.2769070822768533
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_zho_trad
type: gsarti/flores_101_zho_trad
metrics:
- type: byte_perplexity
value: 2.5180582198242383
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_zul
type: gsarti/flores_101_zul
metrics:
- type: byte_perplexity
value: 8.53353320693145
name: byte_perplexity
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: headqa
type: headqa
metrics:
- type: acc
value: 0.26440554339897887
name: acc
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: hellaswag
type: hellaswag
metrics:
- type: acc
value: 0.41236805417247563
name: acc
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: logiqa
type: logiqa
metrics:
- type: acc
value: 0.2073732718894009
name: acc
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: mathqa
type: mathqa
metrics:
- type: acc
value: 0.24958123953098826
name: acc
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: mc_taco
type: mc_taco
metrics:
- type: em
value: 0.11936936936936937
name: em
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: mnli
type: mnli
metrics:
- type: acc
value: 0.35496688741721855
name: acc
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: mnli_mismatched
type: mnli_mismatched
metrics:
- type: acc
value: 0.35211554109031734
name: acc
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: mrpc
type: mrpc
metrics:
- type: acc
value: 0.5857843137254902
name: acc
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: multirc
type: multirc
metrics:
- type: acc
value: 0.5375412541254125
name: acc
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: openbookqa
type: openbookqa
metrics:
- type: acc
value: 0.216
name: acc
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: piqa
type: piqa
metrics:
- type: acc
value: 0.7078346028291621
name: acc
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: prost
type: prost
metrics:
- type: acc
value: 0.22683603757472245
name: acc
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: pubmedqa
type: pubmedqa
metrics:
- type: acc
value: 0.616
name: acc
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: qnli
type: qnli
metrics:
- type: acc
value: 0.5072304594545122
name: acc
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: qqp
type: qqp
metrics:
- type: acc
value: 0.3842443729903537
name: acc
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: race
type: race
metrics:
- type: acc
value: 0.3521531100478469
name: acc
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: rte
type: rte
metrics:
- type: acc
value: 0.47653429602888087
name: acc
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: sciq
type: sciq
metrics:
- type: acc
value: 0.892
name: acc
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: sst
type: sst
metrics:
- type: acc
value: 0.5177752293577982
name: acc
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: triviaqa
type: triviaqa
metrics:
- type: acc
value: 0.041633518960487934
name: acc
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: tydiqa_primary
type: tydiqa_primary
metrics:
- type: acc
value: 0.3011337608795236
name: acc
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: webqs
type: webqs
metrics:
- type: acc
value: 0.01673228346456693
name: acc
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: wic
type: wic
metrics:
- type: acc
value: 0.5015673981191222
name: acc
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: winogrande
type: winogrande
metrics:
- type: acc
value: 0.5864246250986582
name: acc
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: wnli
type: wnli
metrics:
- type: acc
value: 0.471830985915493
name: acc
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: wsc
type: wsc
metrics:
- type: acc
value: 0.4423076923076923
name: acc
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: humaneval
type: humaneval
metrics:
- type: pass@1
value: 0.15524390243902436
name: pass@1
verified: false
- type: pass@10
value: 0.3220367632383857
name: pass@10
verified: false
- type: pass@100
value: 0.5545431515723145
name: pass@100
verified: false
---
<h1 style='text-align: center '>BLOOM LM</h1>
<h2 style='text-align: center '><em>BigScience Large Open-science Open-access Multilingual Language Model</em> </h2>
<h3 style='text-align: center '>Model Card</h3>
<img src="https://s3.amazonaws.com/moonup/production/uploads/1657124309515-5f17f0a0925b9863e28ad517.png" alt="BigScience Logo" width="800" style="margin-left:'auto' margin-right:'auto' display:'block'"/>
Version 1.0 / 26.May.2022
## Table of Contents
1. [Model Details](#model-details)
2. [Uses](#uses)
3. [Training Data](#training-data)
4. [Risks and Limitations](#risks-and-limitations)
5. [Evaluation](#evaluation)
6. [Recommendations](#recommendations)
7. [Glossary and Calculations](#glossary-and-calculations)
8. [More Information](#more-information)
9. [Model Card Authors](#model-card-authors)
## Model Details
### Basics
*This section provides information for anyone who wants to know about the model.*
<details>
<summary>Click to expand</summary> <br/>
**Developed by:** BigScience ([website](https://bigscience.huggingface.co))
* All collaborators are either volunteers or have an agreement with their employer. *(Further breakdown of participants forthcoming.)*
**Model Type:** Transformer-based Language Model
**Version:** 1.0.0
**Languages:** Multiple; see [training data](#training-data)
**License:** RAIL License v1.0 ([link](https://huggingface.co/spaces/bigscience/license))
**Release Date Estimate:** Monday, 11.July.2022
**Send Questions to:** [email protected]
**Cite as:** BigScience, _BigScience Language Open-science Open-access Multilingual (BLOOM) Language Model_. International, May 2021-May 2022
**Funded by:**
* The French government.
* Hugging Face ([website](https://huggingface.co)).
* Organizations of contributors. *(Further breakdown of organizations forthcoming.)*
</details>
### Technical Specifications
*This section provides information for people who work on model development.*
<details>
<summary>Click to expand</summary><br/>
Please see [the BLOOM training README](https://github.com/bigscience-workshop/bigscience/tree/master/train/tr11-176B-ml#readme) for full details on replicating training.
**Model Architecture:** Modified from Megatron-LM GPT2 (see [paper](https://arxiv.org/abs/1909.08053), [BLOOM Megatron code](https://github.com/bigscience-workshop/Megatron-DeepSpeed)):
* Decoder-only architecture
* Layer normalization applied to word embeddings layer (`StableEmbedding`; see [code](https://github.com/facebookresearch/bitsandbytes), [paper](https://arxiv.org/pdf/2110.02861.pdf))
* ALiBi positional encodings (see [paper](https://arxiv.org/pdf/2108.12409.pdf)), with GeLU activation functions
* 3,002,557,440 parameters:
* 642,252,800 embedding parameters
* 30 layers, 32 attention heads
* Hidden layers are 2560-dimensional
* Sequence length of 2048 tokens used (see [BLOOM tokenizer](https://huggingface.co/bigscience/tokenizer), [tokenizer description](#tokenization))
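As a back-of-envelope check of the parameter count listed above (a rough estimate only; it ignores biases and layernorm parameters, which account for the remaining ~1M): each decoder layer contributes roughly 12·d² weights (4·d² for the attention projections plus 8·d² for the 4×-wide MLP).

```python
d, layers = 2560, 30
embedding = 642_252_800      # from the card (includes vocabulary padding)
per_layer = 12 * d * d       # 4*d^2 attention + 8*d^2 MLP weights
total = embedding + layers * per_layer
print(total)                 # 3_001_548_800, close to the reported 3_002_557_440
```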
**Objective Function:** Cross Entropy with mean reduction (see [API documentation](https://pytorch.org/docs/stable/generated/torch.nn.CrossEntropyLoss.html#torch.nn.CrossEntropyLoss)).
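A minimal sketch of this objective for next-token prediction (the shapes and the shift-by-one are standard causal-LM practice, not BLOOM-specific code):

```python
import torch
import torch.nn as nn

loss_fn = nn.CrossEntropyLoss(reduction="mean")  # the objective named above

# logits: (batch, seq_len, vocab), labels: (batch, seq_len)
logits = torch.randn(2, 8, 250_680)
labels = torch.randint(0, 250_680, (2, 8))

# shift so position t predicts token t+1, then flatten for CrossEntropyLoss
shift_logits = logits[:, :-1, :].reshape(-1, logits.size(-1))
shift_labels = labels[:, 1:].reshape(-1)
loss = loss_fn(shift_logits, shift_labels)
```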
**Compute infrastructure:** Jean Zay Public Supercomputer, provided by the French government (see [announcement](https://www.enseignementsup-recherche.gouv.fr/fr/signature-du-marche-d-acquisition-de-l-un-des-supercalculateurs-les-plus-puissants-d-europe-46733)).
* Hardware: 384 A100 80GB GPUs (48 nodes):
* Additional 32 A100 80GB GPUs (4 nodes) in reserve
* 8 GPUs per node, using NVLink 4 inter-GPU connects and 4 OmniPath links
* CPU: AMD
* CPU memory: 512GB per node
* GPU memory: 640GB per node
* Inter-node connect: Omni-Path Architecture (OPA)
* NCCL-communications network: a fully dedicated subnet
* Disc IO network: shared network with other types of nodes
* Software:
* Megatron-DeepSpeed ([Github link](https://github.com/bigscience-workshop/Megatron-DeepSpeed))
* DeepSpeed ([Github link](https://github.com/microsoft/DeepSpeed))
* PyTorch (pytorch-1.11 w/ CUDA-11.5; see [Github link](https://github.com/pytorch/pytorch))
* apex ([Github link](https://github.com/NVIDIA/apex))
#### **Training**
Training logs: [Tensorboard link](https://huggingface.co/tensorboard/bigscience/tr11c-2B5-logs)
- Number of epochs: 1 (*current target*)
- Dates:
- Started 11th March, 2022 11:42am PST
- Ended 5th July, 2022
- Estimated cost of training: Equivalent of $2-5M in cloud computing (including preliminary experiments)
- Server training location: Île-de-France, France
#### **Tokenization**
The BLOOM tokenizer ([link](https://huggingface.co/bigscience/tokenizer)) is a learned subword tokenizer trained using:
- A byte-level Byte Pair Encoding (BPE) algorithm
- A simple pre-tokenization rule, no normalization
- A vocabulary size of 250,680
It was trained on a subset of a preliminary version of the corpus using alpha-weighting per language.
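A quick way to inspect the tokenizer (a sketch assuming the `transformers` library; the repo id is taken from the link above):

```python
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("bigscience/tokenizer")
print(tok.vocab_size)                        # expected: 250680, per the card
print(tok.tokenize("BLOOM is multilingual."))
```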
</details>
### Environmental Impact
<details>
<summary>Click to expand</summary><br/>
The training supercomputer, Jean Zay ([website](http://www.idris.fr/eng/jean-zay/jean-zay-presentation-eng.html)), uses mostly nuclear energy. The heat generated by it is reused for heating campus housing.
**Estimated carbon emissions:** *(Forthcoming upon completion of training.)*
**Estimated electricity usage:** *(Forthcoming upon completion of training.)*
</details>
<p> </p>
## Uses
*This section addresses questions around how the model is intended to be used, discusses the foreseeable users of the model (including those affected by the model), and describes uses that are considered out of scope or misuse of the model.
It provides information for anyone considering using the model or who is affected by the model.*
<details>
<summary>Click to expand</summary><br/>
### Intended Use
This model is being created in order to enable public research on large language models (LLMs). LLMs are intended to be used for language generation or as a pretrained base model that can be further fine-tuned for specific tasks. Use cases below are not exhaustive.
#### **Direct Use**
- Text generation (a minimal usage sketch follows below)
- Exploring characteristics of language generated by a language model
- Examples: Cloze tests, counterfactuals, generations with reframings
#### **Downstream Use**
- Tasks that leverage language models include: Information Extraction, Question Answering, Summarization
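As a minimal illustration of the direct text-generation use case above (the checkpoint id below is an assumption; substitute the repository this card accompanies):

```python
from transformers import pipeline

# Checkpoint id is assumed; any BLOOM checkpoint loads the same way.
generator = pipeline("text-generation", model="bigscience/bloom-3b")
out = generator("The BigScience workshop was created to", max_new_tokens=30)
print(out[0]["generated_text"])
```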
### Misuse and Out-of-scope Use
*This section addresses what users ought not do with the model.*
See the [BLOOM License](https://huggingface.co/spaces/bigscience/license), Attachment A, for detailed usage restrictions. The below list is non-exhaustive, but lists some easily foreseeable problematic use cases.
#### **Out-of-scope Uses**
Using the model in [high-stakes](#high-stakes) settings is out of scope for this model. The model is not designed for [critical decisions](#critical-decisions) nor for uses with any material consequences on an individual's livelihood or wellbeing. The model can output content that appears factual but is not correct.
##### Out-of-scope Uses Include:
- Usage in biomedical domains, political and legal domains, or finance domains
- Usage for evaluating or scoring individuals, such as for employment, education, or credit
- Applying the model for critical automatic decisions, generating factual content, creating reliable summaries, or generating predictions that must be correct
#### **Misuse**
Intentionally using the model for harm, violating [human rights](#human-rights), or other kinds of malicious activities, is a misuse of this model. This includes:
- Spam generation
- Disinformation and influence operations
- Disparagement and defamation
- Harassment and abuse
- [Deception](#deception)
- Unconsented impersonation and imitation
- Unconsented surveillance
- Generating content without attribution to the model, as specified in the [RAIL License, Use Restrictions](https://huggingface.co/spaces/bigscience/license)
### Intended Users
#### **Direct Users**
- General Public
- Researchers
- Students
- Educators
- Engineers/developers
- Non-commercial entities
- Community advocates, including human and civil rights groups
#### Indirect Users
- Users of derivatives created by Direct Users, such as those using software with an [intended use](#intended-use)
- Users of [Derivatives of the Model, as described in the License](https://huggingface.co/spaces/bigscience/license)
#### Others Affected (Parties Prenantes)
- People and groups referred to by the LLM
- People and groups exposed to outputs of, or decisions based on, the LLM
- People and groups whose original work is included in the LLM
</details>
<p> </p>
## Training Data
*This section provides a high-level overview of the training data. It is relevant for anyone who wants to know the basics of what the model is learning.*
<details>
<summary>Click to expand</summary><br/>
Details for each dataset are provided in individual [Data Cards](https://huggingface.co/spaces/bigscience/BigScienceCorpus).
Training data includes:
- 45 natural languages
- 12 programming languages
- In 1.5TB of pre-processed text, converted into 350B unique tokens (see [the tokenizer section](#tokenization) for more.)
#### **Languages**
The pie chart shows the distribution of languages in training data.

The following table shows the further distribution of Niger-Congo and Indic languages in the training data.
<details>
<summary>Click to expand</summary><br/>
| Niger Congo | Percentage | | Indic | Percentage |
|----------------|------------ |------ |-----------|------------|
| Chi Tumbuka | 0.00002 | | Assamese | 0.01 |
| Kikuyu | 0.00004 | | Odia | 0.04 |
| Bambara | 0.00004 | | Gujarati | 0.04 |
| Akan | 0.00007 | | Marathi | 0.05 |
| Xitsonga | 0.00007 | | Punjabi | 0.05 |
| Sesotho | 0.00007 | | Kannada | 0.06 |
| Chi Chewa | 0.0001 | | Nepali | 0.07 |
| Setswana | 0.0002 | | Telugu | 0.09 |
| Northern Sotho | 0.0002 | | Malayalam | 0.10 |
| Fon | 0.0002 | | Urdu | 0.10 |
| Kirundi | 0.0003 | | Tamil | 0.20 |
| Wolof | 0.0004 | | Bengali | 0.50 |
| Kuganda | 0.0004 | | Hindi | 0.70 |
| Chi Shona | 0.001 | | | |
| Isi Zulu | 0.001 | | | |
| Igbo | 0.001 | | | |
| Xhosa | 0.001 | | | |
| Kinyarwanda | 0.003 | | | |
| Yoruba | 0.006 | | | |
| Swahili | 0.02 | | | |
</details>
The following table shows the distribution of programming languages.
<details>
<summary>Click to expand</summary><br/>
| Extension | Language | Number of files |
|----------------|------------|-----------------|
| java | Java | 5,407,724 |
| php | PHP | 4,942,186 |
| cpp | C++ | 2,503,930 |
| py | Python | 2,435,072 |
| js | JavaScript | 1,905,518 |
| cs | C# | 1,577,347 |
| rb | Ruby | 678,413 |
| cc | C++ | 443,054 |
| hpp | C++ | 391,048 |
| lua | Lua | 352,317 |
| go | Go | 227,763 |
| ts | TypeScript | 195,254 |
| C | C | 134,537 |
| scala | Scala | 92,052 |
| hh | C++ | 67,161 |
| H | C++ | 55,899 |
| tsx | TypeScript | 33,107 |
| rs | Rust | 29,693 |
| phpt | PHP | 9,702 |
| c++ | C++ | 1,342 |
| h++ | C++ | 791 |
| php3 | PHP | 540 |
| phps | PHP | 270 |
| php5 | PHP | 166 |
| php4 | PHP | 29 |
</details>
</details>
<p> </p>
## Risks and Limitations
*This section identifies foreseeable harms and misunderstandings.*
<details>
<summary>Click to expand</summary><br/>
Model may:
- Overrepresent some viewpoints and underrepresent others
- Contain stereotypes
- Contain [personal information](#personal-data-and-information)
- Generate:
- Hateful, abusive, or violent language
- Discriminatory or prejudicial language
- Content that may not be appropriate for all settings, including sexual content
- Make errors, including producing incorrect information as if it were factual
- Generate irrelevant or repetitive outputs
</details>
<p> </p>
## Evaluation
*This section describes the evaluation protocols and provides the results.*
<details>
<summary>Click to expand</summary><br/>
### Metrics
*This section describes the different ways performance is calculated and why.*
Includes:
| Metric | Why chosen |
|--------------------|--------------------------------------------------------------------|
| [Perplexity](#perplexity) | Standard metric for quantifying model improvements during training |
| Cross Entropy [Loss](#loss) | Standard objective for language models. |
And multiple different metrics for specific tasks. _(More evaluation metrics forthcoming upon completion of evaluation protocol.)_
### Factors
*This section lists some different aspects of BLOOM models. Its focus is on aspects that are likely to give rise to high variance in model behavior.*
- Language, such as English or Yoruba
- Domain, such as newswire or stories
- Demographic characteristics, such as gender or nationality
### Results
*Results are based on the [Factors](#factors) and [Metrics](#metrics).*
**Zero-shot evaluations:**
See this repository for JSON files: https://github.com/bigscience-workshop/evaluation-results
| Task | Language | Metric | BLOOM-2B5 |
|:----|:----|:----|:----:|
| arc_challenge | eng | acc ↑ | 0.28 |
| arc_easy | eng | acc ↑ | 0.595 |
| axb (Median of 10 prompts) | eng | acc ↑ | 0.443 |
| axg (Median of 10 prompts) | eng | acc ↑ | 0.5 |
| boolq (Median of 11 prompts) | eng | acc ↑ | 0.617 |
| cb (Median of 15 prompts) | eng | acc ↑ | 0.304 |
| cola (Median of 5 prompts) | eng | acc ↑ | 0.611 |
| copa (Median of 9 prompts) | eng | acc ↑ | 0.63 |
| crows_pairs_english (Median of 6 prompts) | eng | acc ↑ | 0.497 |
| crows_pairs_french (Median of 7 prompts) | fra | acc ↑ | 0.503 |
| diabla (Median of 2 prompts) | eng | acc ↑ | 0.289 |
| gsarti/flores_101_afr | afr | byte_perplexity ↓ | 6.501 |
| gsarti/flores_101_amh | amh | byte_perplexity ↓ | 3.973 |
| gsarti/flores_101_ara | ara | byte_perplexity ↓ | 1.808 |
| gsarti/flores_101_asm | asm | byte_perplexity ↓ | 5.699 |
| gsarti/flores_101_ast | ast | byte_perplexity ↓ | 3.925 |
| gsarti/flores_101_azj | azj | byte_perplexity ↓ | 6.943 |
| gsarti/flores_101_bel | bel | byte_perplexity ↓ | 3.614 |
| gsarti/flores_101_ben | ben | byte_perplexity ↓ | 5.121 |
| gsarti/flores_101_bos | bos | byte_perplexity ↓ | 5.653 |
| gsarti/flores_101_bul | bul | byte_perplexity ↓ | 2.701 |
| gsarti/flores_101_cat | cat | byte_perplexity ↓ | 2.305 |
| gsarti/flores_101_ceb | ceb | byte_perplexity ↓ | 6.291 |
| gsarti/flores_101_ces | ces | byte_perplexity ↓ | 5.447 |
| gsarti/flores_101_ckb | ckb | byte_perplexity ↓ | 3.726 |
| gsarti/flores_101_cym | cym | byte_perplexity ↓ | 12.539 |
| gsarti/flores_101_dan | dan | byte_perplexity ↓ | 5.183 |
| gsarti/flores_101_deu | deu | byte_perplexity ↓ | 3.118 |
| gsarti/flores_101_ell | ell | byte_perplexity ↓ | 2.468 |
| gsarti/flores_101_eng | eng | byte_perplexity ↓ | 2.019 |
| gsarti/flores_101_est | est | byte_perplexity ↓ | 9.117 |
| gsarti/flores_101_fas | fas | byte_perplexity ↓ | 3.058 |
| gsarti/flores_101_fin | fin | byte_perplexity ↓ | 6.847 |
| gsarti/flores_101_fra | fra | byte_perplexity ↓ | 1.998 |
| gsarti/flores_101_ful | ful | byte_perplexity ↓ | 11.466 |
| gsarti/flores_101_gle | gle | byte_perplexity ↓ | 8.681 |
| gsarti/flores_101_glg | glg | byte_perplexity ↓ | 3.03 |
| gsarti/flores_101_guj | guj | byte_perplexity ↓ | 4.955 |
| gsarti/flores_101_hau | hau | byte_perplexity ↓ | 10.758 |
| gsarti/flores_101_heb | heb | byte_perplexity ↓ | 3.6 |
| gsarti/flores_101_hin | hin | byte_perplexity ↓ | 4.713 |
| gsarti/flores_101_hrv | hrv | byte_perplexity ↓ | 5.822 |
| gsarti/flores_101_hun | hun | byte_perplexity ↓ | 6.44 |
| gsarti/flores_101_hye | hye | byte_perplexity ↓ | 3.658 |
| gsarti/flores_101_ibo | ibo | byte_perplexity ↓ | 5.565 |
| gsarti/flores_101_ind | ind | byte_perplexity ↓ | 2.16 |
| gsarti/flores_101_isl | isl | byte_perplexity ↓ | 8.082 |
| gsarti/flores_101_ita | ita | byte_perplexity ↓ | 2.969 |
| gsarti/flores_101_jav | jav | byte_perplexity ↓ | 7.057 |
| gsarti/flores_101_jpn | jpn | byte_perplexity ↓ | 2.776 |
| gsarti/flores_101_kam | kam | byte_perplexity ↓ | 11.073 |
| gsarti/flores_101_kan | kan | byte_perplexity ↓ | 5.552 |
| gsarti/flores_101_kat | kat | byte_perplexity ↓ | 2.523 |
| gsarti/flores_101_kaz | kaz | byte_perplexity ↓ | 3.39 |
| gsarti/flores_101_kea | kea | byte_perplexity ↓ | 8.919 |
| gsarti/flores_101_kir | kir | byte_perplexity ↓ | 3.729 |
| gsarti/flores_101_kor | kor | byte_perplexity ↓ | 3.933 |
| gsarti/flores_101_lao | lao | byte_perplexity ↓ | 2.908 |
| gsarti/flores_101_lav | lav | byte_perplexity ↓ | 7.777 |
| gsarti/flores_101_lin | lin | byte_perplexity ↓ | 7.525 |
| gsarti/flores_101_lit | lit | byte_perplexity ↓ | 7.369 |
| gsarti/flores_101_ltz | ltz | byte_perplexity ↓ | 8.801 |
| gsarti/flores_101_lug | lug | byte_perplexity ↓ | 8.483 |
| gsarti/flores_101_luo | luo | byte_perplexity ↓ | 11.976 |
| gsarti/flores_101_mal | mal | byte_perplexity ↓ | 4.616 |
| gsarti/flores_101_mar | mar | byte_perplexity ↓ | 5.483 |
| gsarti/flores_101_mkd | mkd | byte_perplexity ↓ | 2.966 |
| gsarti/flores_101_mlt | mlt | byte_perplexity ↓ | 15.005 |
| gsarti/flores_101_mon | mon | byte_perplexity ↓ | 3.411 |
| gsarti/flores_101_mri | mri | byte_perplexity ↓ | 7.474 |
| gsarti/flores_101_msa | msa | byte_perplexity ↓ | 2.571 |
| gsarti/flores_101_mya | mya | byte_perplexity ↓ | 2.414 |
| gsarti/flores_101_nld | nld | byte_perplexity ↓ | 4.128 |
| gsarti/flores_101_nob | nob | byte_perplexity ↓ | 5.403 |
| gsarti/flores_101_npi | npi | byte_perplexity ↓ | 5.199 |
| gsarti/flores_101_nso | nso | byte_perplexity ↓ | 8.155 |
| gsarti/flores_101_nya | nya | byte_perplexity ↓ | 8.18 |
| gsarti/flores_101_oci | oci | byte_perplexity ↓ | 4.862 |
| gsarti/flores_101_orm | orm | byte_perplexity ↓ | 12.912 |
| gsarti/flores_101_ory | ory | byte_perplexity ↓ | 5.189 |
| gsarti/flores_101_pan | pan | byte_perplexity ↓ | 4.698 |
| gsarti/flores_101_pol | pol | byte_perplexity ↓ | 4.626 |
| gsarti/flores_101_por | por | byte_perplexity ↓ | 1.975 |
| gsarti/flores_101_pus | pus | byte_perplexity ↓ | 4.496 |
| gsarti/flores_101_ron | ron | byte_perplexity ↓ | 4.965 |
| gsarti/flores_101_rus | rus | byte_perplexity ↓ | 2.05 |
| gsarti/flores_101_slk | slk | byte_perplexity ↓ | 6.451 |
| gsarti/flores_101_slv | slv | byte_perplexity ↓ | 6.62 |
| gsarti/flores_101_sna | sna | byte_perplexity ↓ | 8.462 |
| gsarti/flores_101_snd | snd | byte_perplexity ↓ | 5.466 |
| gsarti/flores_101_som | som | byte_perplexity ↓ | 11.959 |
| gsarti/flores_101_spa | spa | byte_perplexity ↓ | 1.897 |
| gsarti/flores_101_srp | srp | byte_perplexity ↓ | 2.871 |
| gsarti/flores_101_swe | swe | byte_perplexity ↓ | 5.055 |
| gsarti/flores_101_swh | swh | byte_perplexity ↓ | 3.697 |
| gsarti/flores_101_tam | tam | byte_perplexity ↓ | 4.539 |
| gsarti/flores_101_tel | tel | byte_perplexity ↓ | 5.807 |
| gsarti/flores_101_tgk | tgk | byte_perplexity ↓ | 3.599 |
| gsarti/flores_101_tgl | tgl | byte_perplexity ↓ | 5.667 |
| gsarti/flores_101_tha | tha | byte_perplexity ↓ | 2.366 |
| gsarti/flores_101_tur | tur | byte_perplexity ↓ | 4.885 |
| gsarti/flores_101_ukr | ukr | byte_perplexity ↓ | 2.724 |
| gsarti/flores_101_umb | umb | byte_perplexity ↓ | 12.767 |
| gsarti/flores_101_urd | urd | byte_perplexity ↓ | 1.98 |
| gsarti/flores_101_uzb | uzb | byte_perplexity ↓ | 12.002 |
| gsarti/flores_101_vie | vie | byte_perplexity ↓ | 1.766 |
| gsarti/flores_101_wol | wol | byte_perplexity ↓ | 9.144 |
| gsarti/flores_101_xho | xho | byte_perplexity ↓ | 7.403 |
| gsarti/flores_101_yor | yor | byte_perplexity ↓ | 5.913 |
| gsarti/flores_101_zho_simpl | zho_simpl | byte_perplexity ↓ | 2.277 |
| gsarti/flores_101_zho_trad | zho_trad | byte_perplexity ↓ | 2.518 |
| gsarti/flores_101_zul | zul | byte_perplexity ↓ | 8.534 |
| headqa | esp | acc ↑ | 0.264 |
| hellaswag | eng | acc ↑ | 0.412 |
| logiqa | eng | acc ↑ | 0.207 |
| mathqa | eng | acc ↑ | 0.25 |
| mc_taco | eng | em ↑ | 0.119 |
| mnli (Median of 15 prompts) | eng | acc ↑ | 0.355 |
| mnli_mismatched (Median of 15 prompts) | eng | acc ↑ | 0.352 |
| mrpc | eng | acc ↑ | 0.586 |
| multirc (Median of 11 prompts) | eng | acc ↑ | 0.538 |
| openbookqa | eng | acc ↑ | 0.216 |
| piqa | eng | acc ↑ | 0.708 |
| prost | eng | acc ↑ | 0.227 |
| pubmedqa | eng | acc ↑ | 0.616 |
| qnli | eng | acc ↑ | 0.507 |
| qqp (Median of 7 prompts) | eng | acc ↑ | 0.384 |
| race | eng | acc ↑ | 0.352 |
| rte (Median of 6 prompts) | eng | acc ↑ | 0.477 |
| sciq | eng | acc ↑ | 0.892 |
| sst (Median of 6 prompts) | eng | acc ↑ | 0.518 |
| triviaqa | eng | acc ↑ | 0.042 |
| tydiqa_primary (Median of 24 prompts) | eng | acc ↑ | 0.301 |
| webqs | eng | acc ↑ | 0.017 |
| wic (Median of 11 prompts) | eng | acc ↑ | 0.502 |
| winogrande | eng | acc ↑ | 0.586 |
| wnli (Median of 6 prompts) | eng | acc ↑ | 0.472 |
| wsc (Median of 11 prompts) | eng | acc ↑ | 0.442 |
| humaneval | python | pass@1 ↑ | 0.155 |
| humaneval | python | pass@10 ↑ | 0.322 |
| humaneval | python | pass@100 ↑ | 0.555 |
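For the HumanEval rows, pass@k is conventionally computed with the unbiased estimator from the Codex paper (Chen et al., 2021); this card does not state the exact sampling setup used here, so the sketch below is illustrative rather than a description of BLOOM's evaluation harness:

```python
from math import comb

def pass_at_k(n: int, c: int, k: int) -> float:
    """Unbiased pass@k: 1 - C(n-c, k) / C(n, k), given n sampled
    completions per problem of which c pass the unit tests."""
    if n - c < k:
        return 1.0
    return 1.0 - comb(n - c, k) / comb(n, k)

# Averaged over all HumanEval problems to produce the table values above.
```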
**Train-time Evaluation:**
As of 25.May.2022, 15:00 PST:
- Training Loss: 2.0
- Validation Loss: 2.2
- Perplexity: 8.9
</details>
<p> </p>
## Recommendations
*This section provides information on warnings and potential mitigations.*
<details>
<summary>Click to expand</summary><br/>
- Indirect users should be made aware when the content they're working with is created by the LLM.
- Users should be aware of [Risks and Limitations](#risks-and-limitations), and include an appropriate age disclaimer or blocking interface as necessary.
- Models pretrained with the LLM should include an updated Model Card.
- Users of the model should provide mechanisms for those affected to provide feedback, such as an email address for comments.
</details>
<p> </p>
## Glossary and Calculations
*This section defines common terms and how metrics are calculated.*
<details>
<summary>Click to expand</summary><br/>
- <a name="loss">**Loss:**</a> A calculation of the difference between what the model has learned and what the data shows ("groundtruth"). The lower the loss, the better. The training process aims to minimize the loss.
- <a name="perplexity">**Perplexity:**</a> This is based on what the model estimates the probability of new data is. The lower the perplexity, the better. If the model is 100% correct at predicting the next token it will see, then the perplexity is 1. Mathematically this is calculated using entropy.
- <a name="high-stakes">**High-stakes settings:**</a> Such as those identified as "high-risk AI systems" and "unacceptable risk AI systems" in the European Union's proposed [Artificial Intelligence (AI) Act](https://artificialintelligenceact.eu/annexes/).
- <a name="critical-decisions">**Critical decisions:**</a> Such as those defined in [the United States' proposed Algorithmic Accountability Act](https://www.congress.gov/117/bills/s3572/BILLS-117s3572is.pdf).
- <a name="human-rights">**Human rights:**</a> Includes those rights defined in the [Universal Declaration of Human Rights](https://www.un.org/sites/un2.un.org/files/2021/03/udhr.pdf).
- <a name="personal-data-and-information">**Personal Data and Personal Information:**</a> Personal data and information is defined in multiple data protection regulations, such as "[personal data](https://gdpr-info.eu/issues/personal-data/)" in the [European Union's General Data Protection Regulation](https://gdpr-info.eu); and "personal information" in the Republic of South Africa's [Protection of Personal Information Act](https://www.gov.za/sites/default/files/gcis_document/201409/3706726-11act4of2013popi.pdf), The People's Republic of China's [Personal information protection law](http://en.npc.gov.cn.cdurl.cn/2021-12/29/c_694559.htm).
- <a name="sensitive-characteristics">**Sensitive characteristics:**</a> This includes specifically protected categories in human rights (see [UHDR, Article 2](https://www.un.org/sites/un2.un.org/files/2021/03/udhr.pdf)) and personal information regulation (see GDPR, [Article 9; Protection of Personal Information Act, Chapter 1](https://www.gov.za/sites/default/files/gcis_document/201409/3706726-11act4of2013popi.pdf))
- <a name="deception">**Deception:**</a> Doing something to intentionally mislead individuals to believe something that is false, such as by creating deadbots or chatbots on social media posing as real people, or generating text documents without making consumers aware that the text is machine generated.
</details>
<p> </p>
## More Information
<details>
<summary>Click to expand</summary><br/>
### Dataset Creation
Blog post detailing the design choices during the dataset creation: https://bigscience.huggingface.co/blog/building-a-tb-scale-multilingual-dataset-for-language-modeling
### Technical Specifications
Blog post summarizing how the architecture, size, shape, and pre-training duration were selected: https://bigscience.huggingface.co/blog/what-language-model-to-train-if-you-have-two-million-gpu-hours
More details on the architecture/optimizer: https://github.com/bigscience-workshop/bigscience/tree/master/train/tr11-176B-ml
Blog post on the hardware/engineering side: https://bigscience.huggingface.co/blog/which-hardware-to-train-a-176b-parameters-model
Details on the distributed setup used for the training: https://github.com/bigscience-workshop/bigscience/tree/master/train/tr11-176B-ml
Tensorboard updated during the training: https://huggingface.co/bigscience/tr11-176B-ml-logs/tensorboard#scalars&tagFilter=loss
Insights on how to approach training, negative results: https://github.com/bigscience-workshop/bigscience/blob/master/train/lessons-learned.md
Details on the obstacles overcome during the preparation on the engineering side (instabilities, optimization of training throughput, and the many technical tricks and questions involved): https://github.com/bigscience-workshop/bigscience/blob/master/train/tr11-176B-ml/chronicles.md
### Initial Results
Initial prompting experiments using interim checkpoints: https://huggingface.co/spaces/bigscience/bloom-book
</details>
<p> </p>
## Model Card Authors
*Ordered roughly chronologically and by amount of time spent.*
Margaret Mitchell, Giada Pistilli, Yacine Jernite, Ezinwanne Ozoani, Marissa Gerchick, Nazneen Rajani, Sasha Luccioni, Irene Solaiman, Maraim Masoud, Somaieh Nikpoor, Carlos Muñoz Ferrandis, Stas Bekman, Christopher Akiki, Danish Contractor, David Lansky, Angelina McMillan-Major, Tristan Thrush, Suzana Ilić, Gérard Dupont, Shayne Longpre, Manan Dey, Stella Biderman, Douwe Kiela, Emi Baylor, Teven Le Scao, Aaron Gokaslan, Julien Launay, Niklas Muennighoff
| ["QUESTION_ANSWERING", "SUMMARIZATION"] | ["PUBMEDQA", "SCIQ"] | Non_BioNLP |
| {"language": ["ak", "ar", "as", "bm", "bn", "ca", "code", "en", "es", "eu", "fon", "fr", "gu", "hi", "id", "ig", "ki", "kn", "lg", "ln", "ml", "mr", "ne", "nso", "ny", "or", "pa", "pt", "rn", "rw", "sn", "st", "sw", "ta", "te", "tn", "ts", "tum", "tw", "ur", "vi", "wo", "xh", "yo", "zh", "zhs", "zht", "zu"], "license": "bigscience-bloom-rail-1.0", "pipeline_tag": "text-generation", "model-index": [{"name": "bloom", "results": [{"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "arc_challenge", "type": "arc_challenge"}, "metrics": [{"type": "acc", "value": 0.27986348122866894, "name": "acc", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "arc_easy", "type": "arc_easy"}, "metrics": [{"type": "acc", "value": 0.5946969696969697, "name": "acc", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "axb", "type": "axb"}, "metrics": [{"type": "acc", "value": 0.4433876811594203, "name": "acc", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "axg", "type": "axg"}, "metrics": [{"type": "acc", "value": 0.5, "name": "acc", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "boolq", "type": "boolq"}, "metrics": [{"type": "acc", "value": 0.6165137614678899, "name": "acc", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "cb", "type": "cb"}, "metrics": [{"type": "acc", "value": 0.30357142857142855, "name": "acc", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "cola", "type": "cola"}, "metrics": [{"type": "acc", "value": 0.610738255033557, "name": "acc", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "copa", "type": "copa"}, "metrics": [{"type": "acc", "value": 0.63, "name": "acc", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "crows_pairs_english", "type": "crows_pairs_english"}, "metrics": [{"type": "acc", "value": 0.4973166368515206, "name": "acc", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "crows_pairs_french", "type": "crows_pairs_french"}, "metrics": [{"type": "acc", "value": 0.5032796660703638, "name": "acc", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "diabla", "type": "diabla"}, "metrics": [{"type": "acc", "value": 0.28888308977035493, "name": "acc", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_afr", "type": "gsarti/flores_101_afr"}, "metrics": [{"type": "byte_perplexity", "value": 6.500798737976343, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_amh", "type": "gsarti/flores_101_amh"}, "metrics": [{"type": "byte_perplexity", "value": 3.9726863338897145, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_ara", "type": "gsarti/flores_101_ara"}, "metrics": [{"type": "byte_perplexity", "value": 1.8083841089875814, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", 
"name": "text generation"}, "dataset": {"name": "gsarti/flores_101_asm", "type": "gsarti/flores_101_asm"}, "metrics": [{"type": "byte_perplexity", "value": 5.699102962086425, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_ast", "type": "gsarti/flores_101_ast"}, "metrics": [{"type": "byte_perplexity", "value": 3.9252047073429384, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_azj", "type": "gsarti/flores_101_azj"}, "metrics": [{"type": "byte_perplexity", "value": 6.942805054270002, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_bel", "type": "gsarti/flores_101_bel"}, "metrics": [{"type": "byte_perplexity", "value": 3.614136245847082, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_ben", "type": "gsarti/flores_101_ben"}, "metrics": [{"type": "byte_perplexity", "value": 5.121491534300969, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_bos", "type": "gsarti/flores_101_bos"}, "metrics": [{"type": "byte_perplexity", "value": 5.653353469118798, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_bul", "type": "gsarti/flores_101_bul"}, "metrics": [{"type": "byte_perplexity", "value": 2.7014693938055068, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_cat", "type": "gsarti/flores_101_cat"}, "metrics": [{"type": "byte_perplexity", "value": 2.305190041967345, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_ceb", "type": "gsarti/flores_101_ceb"}, "metrics": [{"type": "byte_perplexity", "value": 6.291000321323428, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_ces", "type": "gsarti/flores_101_ces"}, "metrics": [{"type": "byte_perplexity", "value": 5.447322753586386, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_ckb", "type": "gsarti/flores_101_ckb"}, "metrics": [{"type": "byte_perplexity", "value": 3.7255124939234765, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_cym", "type": "gsarti/flores_101_cym"}, "metrics": [{"type": "byte_perplexity", "value": 12.539424151448149, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_dan", "type": "gsarti/flores_101_dan"}, "metrics": [{"type": "byte_perplexity", "value": 5.183309001005672, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_deu", "type": "gsarti/flores_101_deu"}, "metrics": [{"type": "byte_perplexity", "value": 
3.1180422286591347, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_ell", "type": "gsarti/flores_101_ell"}, "metrics": [{"type": "byte_perplexity", "value": 2.467943456164706, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_eng", "type": "gsarti/flores_101_eng"}, "metrics": [{"type": "byte_perplexity", "value": 2.018740628193298, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_est", "type": "gsarti/flores_101_est"}, "metrics": [{"type": "byte_perplexity", "value": 9.11654425176368, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_fas", "type": "gsarti/flores_101_fas"}, "metrics": [{"type": "byte_perplexity", "value": 3.058009097116482, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_fin", "type": "gsarti/flores_101_fin"}, "metrics": [{"type": "byte_perplexity", "value": 6.847047959628553, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_fra", "type": "gsarti/flores_101_fra"}, "metrics": [{"type": "byte_perplexity", "value": 1.9975177011840075, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_ful", "type": "gsarti/flores_101_ful"}, "metrics": [{"type": "byte_perplexity", "value": 11.465912731488828, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_gle", "type": "gsarti/flores_101_gle"}, "metrics": [{"type": "byte_perplexity", "value": 8.681491663539422, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_glg", "type": "gsarti/flores_101_glg"}, "metrics": [{"type": "byte_perplexity", "value": 3.029991089015508, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_guj", "type": "gsarti/flores_101_guj"}, "metrics": [{"type": "byte_perplexity", "value": 4.955224230286231, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_hau", "type": "gsarti/flores_101_hau"}, "metrics": [{"type": "byte_perplexity", "value": 10.758347356372159, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_heb", "type": "gsarti/flores_101_heb"}, "metrics": [{"type": "byte_perplexity", "value": 3.6004478129801667, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_hin", "type": "gsarti/flores_101_hin"}, "metrics": [{"type": "byte_perplexity", "value": 4.712530650588064, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": 
"gsarti/flores_101_hrv", "type": "gsarti/flores_101_hrv"}, "metrics": [{"type": "byte_perplexity", "value": 5.822418943372185, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_hun", "type": "gsarti/flores_101_hun"}, "metrics": [{"type": "byte_perplexity", "value": 6.440482646965992, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_hye", "type": "gsarti/flores_101_hye"}, "metrics": [{"type": "byte_perplexity", "value": 3.657718918347166, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_ibo", "type": "gsarti/flores_101_ibo"}, "metrics": [{"type": "byte_perplexity", "value": 5.564814003872672, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_ind", "type": "gsarti/flores_101_ind"}, "metrics": [{"type": "byte_perplexity", "value": 2.1597101468869373, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_isl", "type": "gsarti/flores_101_isl"}, "metrics": [{"type": "byte_perplexity", "value": 8.082349269518136, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_ita", "type": "gsarti/flores_101_ita"}, "metrics": [{"type": "byte_perplexity", "value": 2.9687591414176207, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_jav", "type": "gsarti/flores_101_jav"}, "metrics": [{"type": "byte_perplexity", "value": 7.0573805415708994, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_jpn", "type": "gsarti/flores_101_jpn"}, "metrics": [{"type": "byte_perplexity", "value": 2.7758864197116933, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_kam", "type": "gsarti/flores_101_kam"}, "metrics": [{"type": "byte_perplexity", "value": 11.072949642861332, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_kan", "type": "gsarti/flores_101_kan"}, "metrics": [{"type": "byte_perplexity", "value": 5.551730651007082, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_kat", "type": "gsarti/flores_101_kat"}, "metrics": [{"type": "byte_perplexity", "value": 2.522630524283745, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_kaz", "type": "gsarti/flores_101_kaz"}, "metrics": [{"type": "byte_perplexity", "value": 3.3901748516975574, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_kea", "type": "gsarti/flores_101_kea"}, "metrics": [{"type": "byte_perplexity", "value": 8.918534182590863, "name": "byte_perplexity", 
"verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_kir", "type": "gsarti/flores_101_kir"}, "metrics": [{"type": "byte_perplexity", "value": 3.729278369847201, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_kor", "type": "gsarti/flores_101_kor"}, "metrics": [{"type": "byte_perplexity", "value": 3.932884847226212, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_lao", "type": "gsarti/flores_101_lao"}, "metrics": [{"type": "byte_perplexity", "value": 2.9077314760849924, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_lav", "type": "gsarti/flores_101_lav"}, "metrics": [{"type": "byte_perplexity", "value": 7.777221919194806, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_lin", "type": "gsarti/flores_101_lin"}, "metrics": [{"type": "byte_perplexity", "value": 7.524842908050988, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_lit", "type": "gsarti/flores_101_lit"}, "metrics": [{"type": "byte_perplexity", "value": 7.369179434621725, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_ltz", "type": "gsarti/flores_101_ltz"}, "metrics": [{"type": "byte_perplexity", "value": 8.801059747949214, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_lug", "type": "gsarti/flores_101_lug"}, "metrics": [{"type": "byte_perplexity", "value": 8.483203026364786, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_luo", "type": "gsarti/flores_101_luo"}, "metrics": [{"type": "byte_perplexity", "value": 11.975963093623681, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_mal", "type": "gsarti/flores_101_mal"}, "metrics": [{"type": "byte_perplexity", "value": 4.615948455160037, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_mar", "type": "gsarti/flores_101_mar"}, "metrics": [{"type": "byte_perplexity", "value": 5.483253482821379, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_mkd", "type": "gsarti/flores_101_mkd"}, "metrics": [{"type": "byte_perplexity", "value": 2.9656732291754087, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_mlt", "type": "gsarti/flores_101_mlt"}, "metrics": [{"type": "byte_perplexity", "value": 15.004773437665275, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_mon", "type": 
"gsarti/flores_101_mon"}, "metrics": [{"type": "byte_perplexity", "value": 3.410598542315402, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_mri", "type": "gsarti/flores_101_mri"}, "metrics": [{"type": "byte_perplexity", "value": 7.474035895661322, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_msa", "type": "gsarti/flores_101_msa"}, "metrics": [{"type": "byte_perplexity", "value": 2.5710001772665634, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_mya", "type": "gsarti/flores_101_mya"}, "metrics": [{"type": "byte_perplexity", "value": 2.413577969878331, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_nld", "type": "gsarti/flores_101_nld"}, "metrics": [{"type": "byte_perplexity", "value": 4.127831721885065, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_nob", "type": "gsarti/flores_101_nob"}, "metrics": [{"type": "byte_perplexity", "value": 5.402763169129877, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_npi", "type": "gsarti/flores_101_npi"}, "metrics": [{"type": "byte_perplexity", "value": 5.199342701937889, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_nso", "type": "gsarti/flores_101_nso"}, "metrics": [{"type": "byte_perplexity", "value": 8.154626800955667, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_nya", "type": "gsarti/flores_101_nya"}, "metrics": [{"type": "byte_perplexity", "value": 8.179860208369393, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_oci", "type": "gsarti/flores_101_oci"}, "metrics": [{"type": "byte_perplexity", "value": 4.8617357393685845, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_orm", "type": "gsarti/flores_101_orm"}, "metrics": [{"type": "byte_perplexity", "value": 12.911595421079408, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_ory", "type": "gsarti/flores_101_ory"}, "metrics": [{"type": "byte_perplexity", "value": 5.189421861225964, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_pan", "type": "gsarti/flores_101_pan"}, "metrics": [{"type": "byte_perplexity", "value": 4.698477289331806, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_pol", "type": "gsarti/flores_101_pol"}, "metrics": [{"type": "byte_perplexity", "value": 4.625550458479643, "name": "byte_perplexity", "verified": false}]}, {"task": 
{"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_por", "type": "gsarti/flores_101_por"}, "metrics": [{"type": "byte_perplexity", "value": 1.9754515986213523, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_pus", "type": "gsarti/flores_101_pus"}, "metrics": [{"type": "byte_perplexity", "value": 4.4963371422771585, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_ron", "type": "gsarti/flores_101_ron"}, "metrics": [{"type": "byte_perplexity", "value": 4.965456830031304, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_rus", "type": "gsarti/flores_101_rus"}, "metrics": [{"type": "byte_perplexity", "value": 2.0498020542445303, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_slk", "type": "gsarti/flores_101_slk"}, "metrics": [{"type": "byte_perplexity", "value": 6.450822127057479, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_slv", "type": "gsarti/flores_101_slv"}, "metrics": [{"type": "byte_perplexity", "value": 6.620252120186232, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_sna", "type": "gsarti/flores_101_sna"}, "metrics": [{"type": "byte_perplexity", "value": 8.462166771382726, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_snd", "type": "gsarti/flores_101_snd"}, "metrics": [{"type": "byte_perplexity", "value": 5.466066951221973, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_som", "type": "gsarti/flores_101_som"}, "metrics": [{"type": "byte_perplexity", "value": 11.95918054093392, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_spa", "type": "gsarti/flores_101_spa"}, "metrics": [{"type": "byte_perplexity", "value": 1.8965140104323535, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_srp", "type": "gsarti/flores_101_srp"}, "metrics": [{"type": "byte_perplexity", "value": 2.871214785885079, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_swe", "type": "gsarti/flores_101_swe"}, "metrics": [{"type": "byte_perplexity", "value": 5.054972008155866, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_swh", "type": "gsarti/flores_101_swh"}, "metrics": [{"type": "byte_perplexity", "value": 3.6973091886730676, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_tam", "type": "gsarti/flores_101_tam"}, "metrics": [{"type": 
"byte_perplexity", "value": 4.539493400469833, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_tel", "type": "gsarti/flores_101_tel"}, "metrics": [{"type": "byte_perplexity", "value": 5.807499987508966, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_tgk", "type": "gsarti/flores_101_tgk"}, "metrics": [{"type": "byte_perplexity", "value": 3.5994818827380426, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_tgl", "type": "gsarti/flores_101_tgl"}, "metrics": [{"type": "byte_perplexity", "value": 5.667053833119858, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_tha", "type": "gsarti/flores_101_tha"}, "metrics": [{"type": "byte_perplexity", "value": 2.365940201944242, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_tur", "type": "gsarti/flores_101_tur"}, "metrics": [{"type": "byte_perplexity", "value": 4.885014749844601, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_ukr", "type": "gsarti/flores_101_ukr"}, "metrics": [{"type": "byte_perplexity", "value": 2.7240934990288483, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_umb", "type": "gsarti/flores_101_umb"}, "metrics": [{"type": "byte_perplexity", "value": 12.766915508610673, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_urd", "type": "gsarti/flores_101_urd"}, "metrics": [{"type": "byte_perplexity", "value": 1.9797467071381232, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_uzb", "type": "gsarti/flores_101_uzb"}, "metrics": [{"type": "byte_perplexity", "value": 12.002337637722146, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_vie", "type": "gsarti/flores_101_vie"}, "metrics": [{"type": "byte_perplexity", "value": 1.76578415476397, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_wol", "type": "gsarti/flores_101_wol"}, "metrics": [{"type": "byte_perplexity", "value": 9.144285650306488, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_xho", "type": "gsarti/flores_101_xho"}, "metrics": [{"type": "byte_perplexity", "value": 7.403240538286952, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_yor", "type": "gsarti/flores_101_yor"}, "metrics": [{"type": "byte_perplexity", "value": 5.91272037551173, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text 
generation"}, "dataset": {"name": "gsarti/flores_101_zho_simpl", "type": "gsarti/flores_101_zho_simpl"}, "metrics": [{"type": "byte_perplexity", "value": 2.2769070822768533, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_zho_trad", "type": "gsarti/flores_101_zho_trad"}, "metrics": [{"type": "byte_perplexity", "value": 2.5180582198242383, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "gsarti/flores_101_zul", "type": "gsarti/flores_101_zul"}, "metrics": [{"type": "byte_perplexity", "value": 8.53353320693145, "name": "byte_perplexity", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "headqa", "type": "headqa"}, "metrics": [{"type": "acc", "value": 0.26440554339897887, "name": "acc", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "hellaswag", "type": "hellaswag"}, "metrics": [{"type": "acc", "value": 0.41236805417247563, "name": "acc", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "logiqa", "type": "logiqa"}, "metrics": [{"type": "acc", "value": 0.2073732718894009, "name": "acc", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "mathqa", "type": "mathqa"}, "metrics": [{"type": "acc", "value": 0.24958123953098826, "name": "acc", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "mc_taco", "type": "mc_taco"}, "metrics": [{"type": "em", "value": 0.11936936936936937, "name": "em", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "mnli", "type": "mnli"}, "metrics": [{"type": "acc", "value": 0.35496688741721855, "name": "acc", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "mnli_mismatched", "type": "mnli_mismatched"}, "metrics": [{"type": "acc", "value": 0.35211554109031734, "name": "acc", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "mrpc", "type": "mrpc"}, "metrics": [{"type": "acc", "value": 0.5857843137254902, "name": "acc", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "multirc", "type": "multirc"}, "metrics": [{"type": "acc", "value": 0.5375412541254125, "name": "acc", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "openbookqa", "type": "openbookqa"}, "metrics": [{"type": "acc", "value": 0.216, "name": "acc", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "piqa", "type": "piqa"}, "metrics": [{"type": "acc", "value": 0.7078346028291621, "name": "acc", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "prost", "type": "prost"}, "metrics": [{"type": "acc", "value": 0.22683603757472245, "name": "acc", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "pubmedqa", "type": "pubmedqa"}, "metrics": [{"type": "acc", "value": 0.616, "name": "acc", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": 
"qnli", "type": "qnli"}, "metrics": [{"type": "acc", "value": 0.5072304594545122, "name": "acc", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "qqp", "type": "qqp"}, "metrics": [{"type": "acc", "value": 0.3842443729903537, "name": "acc", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "race", "type": "race"}, "metrics": [{"type": "acc", "value": 0.3521531100478469, "name": "acc", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "rte", "type": "rte"}, "metrics": [{"type": "acc", "value": 0.47653429602888087, "name": "acc", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "sciq", "type": "sciq"}, "metrics": [{"type": "acc", "value": 0.892, "name": "acc", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "sst", "type": "sst"}, "metrics": [{"type": "acc", "value": 0.5177752293577982, "name": "acc", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "triviaqa", "type": "triviaqa"}, "metrics": [{"type": "acc", "value": 0.041633518960487934, "name": "acc", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "tydiqa_primary", "type": "tydiqa_primary"}, "metrics": [{"type": "acc", "value": 0.3011337608795236, "name": "acc", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "webqs", "type": "webqs"}, "metrics": [{"type": "acc", "value": 0.01673228346456693, "name": "acc", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "wic", "type": "wic"}, "metrics": [{"type": "acc", "value": 0.5015673981191222, "name": "acc", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "winogrande", "type": "winogrande"}, "metrics": [{"type": "acc", "value": 0.5864246250986582, "name": "acc", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "wnli", "type": "wnli"}, "metrics": [{"type": "acc", "value": 0.471830985915493, "name": "acc", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "wsc", "type": "wsc"}, "metrics": [{"type": "acc", "value": 0.4423076923076923, "name": "acc", "verified": false}]}, {"task": {"type": "text-generation", "name": "text generation"}, "dataset": {"name": "humaneval", "type": "humaneval"}, "metrics": [{"type": "pass@1", "value": 0.15524390243902436, "name": "pass@1", "verified": false}, {"type": "pass@10", "value": 0.3220367632383857, "name": "pass@10", "verified": false}, {"type": "pass@100", "value": 0.5545431515723145, "name": "pass@100", "verified": false}]}]}]} |
twadada/mpn | twadada | null | [
"mteb",
"model-index",
"region:us"
]
| 2024-09-08T13:47:44 | 2024-09-08T13:48:00 | 0 | 0 | ---
tags:
- mteb
model-index:
- name: mpnet_main
results:
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (en)
type: None
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 68.11940298507461
- type: ap
value: 30.542146596139048
- type: f1
value: 61.92465989589396
- task:
type: Classification
dataset:
name: MTEB AmazonPolarityClassification
type: None
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 62.503875
- type: ap
value: 58.0577571607728
- type: f1
value: 62.34928241469865
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (en)
type: None
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 31.66999999999999
- type: f1
value: 31.2458385101798
- task:
type: Retrieval
dataset:
name: MTEB ArguAna
type: None
config: default
split: test
revision: c22ab2a51041ffd869aaddef7af8d8215647e41a
metrics:
- type: map_at_1
value: 23.329
- type: map_at_10
value: 37.384
- type: map_at_100
value: 38.57
- type: map_at_1000
value: 38.586999999999996
- type: map_at_3
value: 32.492
- type: map_at_5
value: 35.376000000000005
- type: mrr_at_1
value: 23.755000000000003
- type: mrr_at_10
value: 37.547000000000004
- type: mrr_at_100
value: 38.733000000000004
- type: mrr_at_1000
value: 38.749
- type: mrr_at_3
value: 32.658
- type: mrr_at_5
value: 35.567
- type: ndcg_at_1
value: 23.329
- type: ndcg_at_10
value: 45.574999999999996
- type: ndcg_at_100
value: 50.953
- type: ndcg_at_1000
value: 51.354
- type: ndcg_at_3
value: 35.608000000000004
- type: ndcg_at_5
value: 40.784
- type: precision_at_1
value: 23.329
- type: precision_at_10
value: 7.183000000000001
- type: precision_at_100
value: 0.962
- type: precision_at_1000
value: 0.099
- type: precision_at_3
value: 14.889
- type: precision_at_5
value: 11.437
- type: recall_at_1
value: 23.329
- type: recall_at_10
value: 71.83500000000001
- type: recall_at_100
value: 96.15899999999999
- type: recall_at_1000
value: 99.21799999999999
- type: recall_at_3
value: 44.666
- type: recall_at_5
value: 57.18299999999999
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringP2P
type: None
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 37.14558539727219
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringS2S
type: None
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 27.2291028039137
- task:
type: Reranking
dataset:
name: MTEB AskUbuntuDupQuestions
type: None
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 55.286820678107716
- type: mrr
value: 69.43762916062084
- task:
type: STS
dataset:
name: MTEB BIOSSES
type: None
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_pearson
value: 81.32170750010246
- type: cos_sim_spearman
value: 78.48130632209363
- type: euclidean_pearson
value: 80.42696573048755
- type: euclidean_spearman
value: 78.48130632209363
- type: manhattan_pearson
value: 80.68662655318546
- type: manhattan_spearman
value: 78.4475706136436
- task:
type: Classification
dataset:
name: MTEB Banking77Classification
type: None
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 73.05194805194805
- type: f1
value: 72.31114319146532
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringP2P
type: None
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 33.842006272885655
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringS2S
type: None
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 25.238443830234058
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackAndroidRetrieval
type: None
config: default
split: test
revision: f46a197baaae43b4f621051089b82a364682dfeb
metrics:
- type: map_at_1
value: 22.647000000000002
- type: map_at_10
value: 29.842999999999996
- type: map_at_100
value: 31.131999999999998
- type: map_at_1000
value: 31.289
- type: map_at_3
value: 27.534999999999997
- type: map_at_5
value: 28.782999999999998
- type: mrr_at_1
value: 28.183000000000003
- type: mrr_at_10
value: 35.225
- type: mrr_at_100
value: 36.128
- type: mrr_at_1000
value: 36.198
- type: mrr_at_3
value: 33.19
- type: mrr_at_5
value: 34.363
- type: ndcg_at_1
value: 28.183000000000003
- type: ndcg_at_10
value: 34.644000000000005
- type: ndcg_at_100
value: 40.194
- type: ndcg_at_1000
value: 43.289
- type: ndcg_at_3
value: 31.259999999999998
- type: ndcg_at_5
value: 32.707
- type: precision_at_1
value: 28.183000000000003
- type: precision_at_10
value: 6.666999999999999
- type: precision_at_100
value: 1.187
- type: precision_at_1000
value: 0.185
- type: precision_at_3
value: 14.974000000000002
- type: precision_at_5
value: 10.844
- type: recall_at_1
value: 22.647000000000002
- type: recall_at_10
value: 42.792
- type: recall_at_100
value: 67.399
- type: recall_at_1000
value: 88.646
- type: recall_at_3
value: 32.535
- type: recall_at_5
value: 36.748999999999995
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackEnglishRetrieval
type: None
config: default
split: test
revision: ad9991cb51e31e31e430383c75ffb2885547b5f0
metrics:
- type: map_at_1
value: 15.895000000000001
- type: map_at_10
value: 21.631
- type: map_at_100
value: 22.56
- type: map_at_1000
value: 22.689
- type: map_at_3
value: 19.799
- type: map_at_5
value: 20.824
- type: mrr_at_1
value: 20.191
- type: mrr_at_10
value: 25.674999999999997
- type: mrr_at_100
value: 26.482
- type: mrr_at_1000
value: 26.558
- type: mrr_at_3
value: 23.854
- type: mrr_at_5
value: 24.85
- type: ndcg_at_1
value: 20.191
- type: ndcg_at_10
value: 25.428
- type: ndcg_at_100
value: 29.799999999999997
- type: ndcg_at_1000
value: 32.927
- type: ndcg_at_3
value: 22.284000000000002
- type: ndcg_at_5
value: 23.699
- type: precision_at_1
value: 20.191
- type: precision_at_10
value: 4.7829999999999995
- type: precision_at_100
value: 0.876
- type: precision_at_1000
value: 0.14200000000000002
- type: precision_at_3
value: 10.743
- type: precision_at_5
value: 7.720000000000001
- type: recall_at_1
value: 15.895000000000001
- type: recall_at_10
value: 32.789
- type: recall_at_100
value: 52.156000000000006
- type: recall_at_1000
value: 73.804
- type: recall_at_3
value: 23.589
- type: recall_at_5
value: 27.486
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGamingRetrieval
type: None
config: default
split: test
revision: 4885aa143210c98657558c04aaf3dc47cfb54340
metrics:
- type: map_at_1
value: 25.465
- type: map_at_10
value: 33.93
- type: map_at_100
value: 35.07
- type: map_at_1000
value: 35.165
- type: map_at_3
value: 31.091
- type: map_at_5
value: 32.722
- type: mrr_at_1
value: 29.654999999999998
- type: mrr_at_10
value: 37.156
- type: mrr_at_100
value: 38.074000000000005
- type: mrr_at_1000
value: 38.132
- type: mrr_at_3
value: 34.608
- type: mrr_at_5
value: 36.077999999999996
- type: ndcg_at_1
value: 29.654999999999998
- type: ndcg_at_10
value: 38.872
- type: ndcg_at_100
value: 44.293
- type: ndcg_at_1000
value: 46.455999999999996
- type: ndcg_at_3
value: 33.661
- type: ndcg_at_5
value: 36.237
- type: precision_at_1
value: 29.654999999999998
- type: precision_at_10
value: 6.464
- type: precision_at_100
value: 1.012
- type: precision_at_1000
value: 0.127
- type: precision_at_3
value: 14.943000000000001
- type: precision_at_5
value: 10.696
- type: recall_at_1
value: 25.465
- type: recall_at_10
value: 50.8
- type: recall_at_100
value: 75.373
- type: recall_at_1000
value: 91.053
- type: recall_at_3
value: 36.808
- type: recall_at_5
value: 43.069
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGisRetrieval
type: None
config: default
split: test
revision: 5003b3064772da1887988e05400cf3806fe491f2
metrics:
- type: map_at_1
value: 12.853
- type: map_at_10
value: 18.047
- type: map_at_100
value: 18.9
- type: map_at_1000
value: 19.017999999999997
- type: map_at_3
value: 16.325
- type: map_at_5
value: 17.281
- type: mrr_at_1
value: 14.124
- type: mrr_at_10
value: 19.344
- type: mrr_at_100
value: 20.194000000000003
- type: mrr_at_1000
value: 20.298
- type: mrr_at_3
value: 17.589
- type: mrr_at_5
value: 18.601
- type: ndcg_at_1
value: 14.124
- type: ndcg_at_10
value: 21.188000000000002
- type: ndcg_at_100
value: 25.856
- type: ndcg_at_1000
value: 29.275000000000002
- type: ndcg_at_3
value: 17.726
- type: ndcg_at_5
value: 19.397000000000002
- type: precision_at_1
value: 14.124
- type: precision_at_10
value: 3.379
- type: precision_at_100
value: 0.61
- type: precision_at_1000
value: 0.095
- type: precision_at_3
value: 7.608
- type: precision_at_5
value: 5.537
- type: recall_at_1
value: 12.853
- type: recall_at_10
value: 29.731999999999996
- type: recall_at_100
value: 51.99399999999999
- type: recall_at_1000
value: 78.581
- type: recall_at_3
value: 20.339
- type: recall_at_5
value: 24.304000000000002
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackMathematicaRetrieval
type: None
config: default
split: test
revision: 90fceea13679c63fe563ded68f3b6f06e50061de
metrics:
- type: map_at_1
value: 6.736000000000001
- type: map_at_10
value: 10.587
- type: map_at_100
value: 11.515
- type: map_at_1000
value: 11.633000000000001
- type: map_at_3
value: 9.24
- type: map_at_5
value: 9.856
- type: mrr_at_1
value: 8.955
- type: mrr_at_10
value: 13.383999999999999
- type: mrr_at_100
value: 14.297
- type: mrr_at_1000
value: 14.391000000000002
- type: mrr_at_3
value: 11.92
- type: mrr_at_5
value: 12.584999999999999
- type: ndcg_at_1
value: 8.955
- type: ndcg_at_10
value: 13.498
- type: ndcg_at_100
value: 18.684
- type: ndcg_at_1000
value: 22.105
- type: ndcg_at_3
value: 10.881
- type: ndcg_at_5
value: 11.824
- type: precision_at_1
value: 8.955
- type: precision_at_10
value: 2.662
- type: precision_at_100
value: 0.633
- type: precision_at_1000
value: 0.106
- type: precision_at_3
value: 5.473
- type: precision_at_5
value: 3.9800000000000004
- type: recall_at_1
value: 6.736000000000001
- type: recall_at_10
value: 19.945
- type: recall_at_100
value: 43.807
- type: recall_at_1000
value: 69.215
- type: recall_at_3
value: 12.458
- type: recall_at_5
value: 14.878
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackPhysicsRetrieval
type: None
config: default
split: test
revision: 79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4
metrics:
- type: map_at_1
value: 19.169
- type: map_at_10
value: 25.34
- type: map_at_100
value: 26.509
- type: map_at_1000
value: 26.663999999999998
- type: map_at_3
value: 22.964000000000002
- type: map_at_5
value: 24.229
- type: mrr_at_1
value: 23.483999999999998
- type: mrr_at_10
value: 29.872
- type: mrr_at_100
value: 30.775999999999996
- type: mrr_at_1000
value: 30.858
- type: mrr_at_3
value: 27.43
- type: mrr_at_5
value: 28.782000000000004
- type: ndcg_at_1
value: 23.483999999999998
- type: ndcg_at_10
value: 29.859
- type: ndcg_at_100
value: 35.498000000000005
- type: ndcg_at_1000
value: 38.875
- type: ndcg_at_3
value: 25.635
- type: ndcg_at_5
value: 27.522000000000002
- type: precision_at_1
value: 23.483999999999998
- type: precision_at_10
value: 5.573
- type: precision_at_100
value: 1.002
- type: precision_at_1000
value: 0.15
- type: precision_at_3
value: 11.902
- type: precision_at_5
value: 8.72
- type: recall_at_1
value: 19.169
- type: recall_at_10
value: 38.991
- type: recall_at_100
value: 64.13600000000001
- type: recall_at_1000
value: 87.45
- type: recall_at_3
value: 27.053
- type: recall_at_5
value: 31.996999999999996
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackProgrammersRetrieval
type: None
config: default
split: test
revision: 6184bc1440d2dbc7612be22b50686b8826d22b32
metrics:
- type: map_at_1
value: 13.791999999999998
- type: map_at_10
value: 19.362
- type: map_at_100
value: 20.51
- type: map_at_1000
value: 20.663999999999998
- type: map_at_3
value: 17.408
- type: map_at_5
value: 18.373
- type: mrr_at_1
value: 17.122999999999998
- type: mrr_at_10
value: 22.939
- type: mrr_at_100
value: 23.913999999999998
- type: mrr_at_1000
value: 24.016000000000002
- type: mrr_at_3
value: 20.871000000000002
- type: mrr_at_5
value: 22.019
- type: ndcg_at_1
value: 17.122999999999998
- type: ndcg_at_10
value: 23.219
- type: ndcg_at_100
value: 28.610999999999997
- type: ndcg_at_1000
value: 32.361000000000004
- type: ndcg_at_3
value: 19.657
- type: ndcg_at_5
value: 21.153
- type: precision_at_1
value: 17.122999999999998
- type: precision_at_10
value: 4.3950000000000005
- type: precision_at_100
value: 0.852
- type: precision_at_1000
value: 0.136
- type: precision_at_3
value: 9.399000000000001
- type: precision_at_5
value: 6.963
- type: recall_at_1
value: 13.791999999999998
- type: recall_at_10
value: 31.407
- type: recall_at_100
value: 54.69199999999999
- type: recall_at_1000
value: 81.281
- type: recall_at_3
value: 21.253
- type: recall_at_5
value: 25.22
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackRetrieval
type: mteb/cqadupstack
config: default
split: test
revision: 4885aa143210c98657558c04aaf3dc47cfb54340
metrics:
- type: map_at_1
value: 14.433916666666665
- type: map_at_10
value: 19.892166666666668
- type: map_at_100
value: 20.87308333333333
- type: map_at_1000
value: 21.008416666666665
- type: map_at_3
value: 18.058666666666667
- type: map_at_5
value: 19.015583333333336
- type: mrr_at_1
value: 17.51
- type: mrr_at_10
value: 23.03275
- type: mrr_at_100
value: 23.89025
- type: mrr_at_1000
value: 23.980333333333334
- type: mrr_at_3
value: 21.20616666666667
- type: mrr_at_5
value: 22.195833333333333
- type: ndcg_at_1
value: 17.51
- type: ndcg_at_10
value: 23.55825
- type: ndcg_at_100
value: 28.414249999999996
- type: ndcg_at_1000
value: 31.749083333333328
- type: ndcg_at_3
value: 20.22475
- type: ndcg_at_5
value: 21.668916666666664
- type: precision_at_1
value: 17.51
- type: precision_at_10
value: 4.271333333333334
- type: precision_at_100
value: 0.8016666666666666
- type: precision_at_1000
value: 0.12825
- type: precision_at_3
value: 9.423833333333336
- type: precision_at_5
value: 6.818833333333334
- type: recall_at_1
value: 14.433916666666665
- type: recall_at_10
value: 31.521166666666662
- type: recall_at_100
value: 53.71125
- type: recall_at_1000
value: 77.92325000000001
- type: recall_at_3
value: 22.02575
- type: recall_at_5
value: 25.789916666666663
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackStatsRetrieval
type: None
config: default
split: test
revision: 65ac3a16b8e91f9cee4c9828cc7c335575432a2a
metrics:
- type: map_at_1
value: 11.074
- type: map_at_10
value: 15.728
- type: map_at_100
value: 16.442999999999998
- type: map_at_1000
value: 16.536
- type: map_at_3
value: 14.082
- type: map_at_5
value: 14.808
- type: mrr_at_1
value: 12.883
- type: mrr_at_10
value: 17.687
- type: mrr_at_100
value: 18.436
- type: mrr_at_1000
value: 18.515
- type: mrr_at_3
value: 16.181
- type: mrr_at_5
value: 16.84
- type: ndcg_at_1
value: 12.883
- type: ndcg_at_10
value: 18.778
- type: ndcg_at_100
value: 22.817999999999998
- type: ndcg_at_1000
value: 25.657999999999998
- type: ndcg_at_3
value: 15.606
- type: ndcg_at_5
value: 16.727
- type: precision_at_1
value: 12.883
- type: precision_at_10
value: 3.2520000000000002
- type: precision_at_100
value: 0.5780000000000001
- type: precision_at_1000
value: 0.089
- type: precision_at_3
value: 7.156999999999999
- type: precision_at_5
value: 5.061
- type: recall_at_1
value: 11.074
- type: recall_at_10
value: 26.479999999999997
- type: recall_at_100
value: 45.61
- type: recall_at_1000
value: 67.586
- type: recall_at_3
value: 17.377000000000002
- type: recall_at_5
value: 20.238
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackTexRetrieval
type: None
config: default
split: test
revision: 46989137a86843e03a6195de44b09deda022eec7
metrics:
- type: map_at_1
value: 7.303999999999999
- type: map_at_10
value: 10.779
- type: map_at_100
value: 11.484
- type: map_at_1000
value: 11.616
- type: map_at_3
value: 9.62
- type: map_at_5
value: 10.263
- type: mrr_at_1
value: 9.394
- type: mrr_at_10
value: 13.22
- type: mrr_at_100
value: 13.924
- type: mrr_at_1000
value: 14.032
- type: mrr_at_3
value: 11.912
- type: mrr_at_5
value: 12.671
- type: ndcg_at_1
value: 9.394
- type: ndcg_at_10
value: 13.276
- type: ndcg_at_100
value: 17.118
- type: ndcg_at_1000
value: 20.878
- type: ndcg_at_3
value: 11.084
- type: ndcg_at_5
value: 12.113999999999999
- type: precision_at_1
value: 9.394
- type: precision_at_10
value: 2.533
- type: precision_at_100
value: 0.538
- type: precision_at_1000
value: 0.10300000000000001
- type: precision_at_3
value: 5.391
- type: precision_at_5
value: 4.0329999999999995
- type: recall_at_1
value: 7.303999999999999
- type: recall_at_10
value: 18.523999999999997
- type: recall_at_100
value: 36.452
- type: recall_at_1000
value: 64.38199999999999
- type: recall_at_3
value: 12.366000000000001
- type: recall_at_5
value: 14.994
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackUnixRetrieval
type: None
config: default
split: test
revision: 6c6430d3a6d36f8d2a829195bc5dc94d7e063e53
metrics:
- type: map_at_1
value: 12.963
- type: map_at_10
value: 17.46
- type: map_at_100
value: 18.297
- type: map_at_1000
value: 18.428
- type: map_at_3
value: 15.709999999999999
- type: map_at_5
value: 16.551
- type: mrr_at_1
value: 15.485
- type: mrr_at_10
value: 20.501
- type: mrr_at_100
value: 21.278
- type: mrr_at_1000
value: 21.379
- type: mrr_at_3
value: 18.548000000000002
- type: mrr_at_5
value: 19.537
- type: ndcg_at_1
value: 15.485
- type: ndcg_at_10
value: 20.994
- type: ndcg_at_100
value: 25.506
- type: ndcg_at_1000
value: 29.022
- type: ndcg_at_3
value: 17.410999999999998
- type: ndcg_at_5
value: 18.808
- type: precision_at_1
value: 15.485
- type: precision_at_10
value: 3.666
- type: precision_at_100
value: 0.662
- type: precision_at_1000
value: 0.109
- type: precision_at_3
value: 7.898
- type: precision_at_5
value: 5.672
- type: recall_at_1
value: 12.963
- type: recall_at_10
value: 29.201
- type: recall_at_100
value: 50.109
- type: recall_at_1000
value: 75.797
- type: recall_at_3
value: 18.989
- type: recall_at_5
value: 22.601
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackWebmastersRetrieval
type: None
config: default
split: test
revision: 160c094312a0e1facb97e55eeddb698c0abe3571
metrics:
- type: map_at_1
value: 15.260000000000002
- type: map_at_10
value: 21.165
- type: map_at_100
value: 22.400000000000002
- type: map_at_1000
value: 22.612
- type: map_at_3
value: 19.427
- type: map_at_5
value: 20.312
- type: mrr_at_1
value: 19.368
- type: mrr_at_10
value: 25.148
- type: mrr_at_100
value: 26.143
- type: mrr_at_1000
value: 26.235000000000003
- type: mrr_at_3
value: 23.584
- type: mrr_at_5
value: 24.433
- type: ndcg_at_1
value: 19.368
- type: ndcg_at_10
value: 25.239
- type: ndcg_at_100
value: 30.509999999999998
- type: ndcg_at_1000
value: 34.326
- type: ndcg_at_3
value: 22.57
- type: ndcg_at_5
value: 23.668
- type: precision_at_1
value: 19.368
- type: precision_at_10
value: 4.9799999999999995
- type: precision_at_100
value: 1.117
- type: precision_at_1000
value: 0.201
- type: precision_at_3
value: 11.067
- type: precision_at_5
value: 7.904999999999999
- type: recall_at_1
value: 15.260000000000002
- type: recall_at_10
value: 32.368
- type: recall_at_100
value: 56.908
- type: recall_at_1000
value: 82.708
- type: recall_at_3
value: 23.816000000000003
- type: recall_at_5
value: 27.191
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackWordpressRetrieval
type: None
config: default
split: test
revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4
metrics:
- type: map_at_1
value: 10.049
- type: map_at_10
value: 14.834
- type: map_at_100
value: 15.656999999999998
- type: map_at_1000
value: 15.787
- type: map_at_3
value: 13.503000000000002
- type: map_at_5
value: 14.185
- type: mrr_at_1
value: 11.275
- type: mrr_at_10
value: 16.242
- type: mrr_at_100
value: 17.037
- type: mrr_at_1000
value: 17.152
- type: mrr_at_3
value: 14.787
- type: mrr_at_5
value: 15.591
- type: ndcg_at_1
value: 11.275
- type: ndcg_at_10
value: 17.704
- type: ndcg_at_100
value: 22.083
- type: ndcg_at_1000
value: 25.817
- type: ndcg_at_3
value: 14.921999999999999
- type: ndcg_at_5
value: 16.171
- type: precision_at_1
value: 11.275
- type: precision_at_10
value: 2.902
- type: precision_at_100
value: 0.553
- type: precision_at_1000
value: 0.096
- type: precision_at_3
value: 6.531000000000001
- type: precision_at_5
value: 4.695
- type: recall_at_1
value: 10.049
- type: recall_at_10
value: 25.224999999999998
- type: recall_at_100
value: 45.899
- type: recall_at_1000
value: 74.576
- type: recall_at_3
value: 17.726
- type: recall_at_5
value: 20.752000000000002
- task:
type: Retrieval
dataset:
name: MTEB ClimateFEVER
type: None
config: default
split: test
revision: 47f2ac6acb640fc46020b02a5b59fdda04d39380
metrics:
- type: map_at_1
value: 6.762
- type: map_at_10
value: 12.867
- type: map_at_100
value: 14.478
- type: map_at_1000
value: 14.696000000000002
- type: map_at_3
value: 10.437000000000001
- type: map_at_5
value: 11.689
- type: mrr_at_1
value: 15.309000000000001
- type: mrr_at_10
value: 25.839000000000002
- type: mrr_at_100
value: 26.994
- type: mrr_at_1000
value: 27.056
- type: mrr_at_3
value: 22.400000000000002
- type: mrr_at_5
value: 24.451999999999998
- type: ndcg_at_1
value: 15.309000000000001
- type: ndcg_at_10
value: 19.384999999999998
- type: ndcg_at_100
value: 26.517000000000003
- type: ndcg_at_1000
value: 30.676
- type: ndcg_at_3
value: 14.876000000000001
- type: ndcg_at_5
value: 16.611
- type: precision_at_1
value: 15.309000000000001
- type: precision_at_10
value: 6.489000000000001
- type: precision_at_100
value: 1.409
- type: precision_at_1000
value: 0.217
- type: precision_at_3
value: 11.530999999999999
- type: precision_at_5
value: 9.381
- type: recall_at_1
value: 6.762
- type: recall_at_10
value: 24.996
- type: recall_at_100
value: 50.202999999999996
- type: recall_at_1000
value: 73.87899999999999
- type: recall_at_3
value: 14.149000000000001
- type: recall_at_5
value: 18.648
- task:
type: Retrieval
dataset:
name: MTEB DBPedia
type: None
config: default
split: test
revision: c0f706b76e590d620bd6618b3ca8efdd34e2d659
metrics:
- type: map_at_1
value: 3.846
- type: map_at_10
value: 9.048
- type: map_at_100
value: 12.656
- type: map_at_1000
value: 13.605999999999998
- type: map_at_3
value: 6.293
- type: map_at_5
value: 7.5920000000000005
- type: mrr_at_1
value: 41.5
- type: mrr_at_10
value: 51.08200000000001
- type: mrr_at_100
value: 51.82299999999999
- type: mrr_at_1000
value: 51.856
- type: mrr_at_3
value: 48.5
- type: mrr_at_5
value: 49.836999999999996
- type: ndcg_at_1
value: 30.375000000000004
- type: ndcg_at_10
value: 23.343
- type: ndcg_at_100
value: 26.261000000000003
- type: ndcg_at_1000
value: 33.053
- type: ndcg_at_3
value: 25.814999999999998
- type: ndcg_at_5
value: 24.583
- type: precision_at_1
value: 41.5
- type: precision_at_10
value: 20.849999999999998
- type: precision_at_100
value: 6.635000000000001
- type: precision_at_1000
value: 1.438
- type: precision_at_3
value: 30.833
- type: precision_at_5
value: 26.85
- type: recall_at_1
value: 3.846
- type: recall_at_10
value: 13.83
- type: recall_at_100
value: 32.757999999999996
- type: recall_at_1000
value: 56.25
- type: recall_at_3
value: 7.574
- type: recall_at_5
value: 10.071
- task:
type: Classification
dataset:
name: MTEB EmotionClassification
type: None
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 46.62
- type: f1
value: 42.79767018915584
- task:
type: Retrieval
dataset:
name: MTEB FEVER
type: None
config: default
split: test
revision: bea83ef9e8fb933d90a2f1d5515737465d613e12
metrics:
- type: map_at_1
value: 15.677
- type: map_at_10
value: 23.551
- type: map_at_100
value: 24.442
- type: map_at_1000
value: 24.514
- type: map_at_3
value: 21.192
- type: map_at_5
value: 22.499
- type: mrr_at_1
value: 16.742
- type: mrr_at_10
value: 25.019000000000002
- type: mrr_at_100
value: 25.894000000000002
- type: mrr_at_1000
value: 25.958
- type: mrr_at_3
value: 22.555
- type: mrr_at_5
value: 23.909
- type: ndcg_at_1
value: 16.742
- type: ndcg_at_10
value: 28.247
- type: ndcg_at_100
value: 32.797
- type: ndcg_at_1000
value: 34.809
- type: ndcg_at_3
value: 23.358999999999998
- type: ndcg_at_5
value: 25.705
- type: precision_at_1
value: 16.742
- type: precision_at_10
value: 4.532
- type: precision_at_100
value: 0.701
- type: precision_at_1000
value: 0.089
- type: precision_at_3
value: 10.145999999999999
- type: precision_at_5
value: 7.342
- type: recall_at_1
value: 15.677
- type: recall_at_10
value: 41.571000000000005
- type: recall_at_100
value: 62.834999999999994
- type: recall_at_1000
value: 78.387
- type: recall_at_3
value: 28.214
- type: recall_at_5
value: 33.891
- task:
type: Retrieval
dataset:
name: MTEB FiQA2018
type: None
config: default
split: test
revision: 27a168819829fe9bcd655c2df245fb19452e8e06
metrics:
- type: map_at_1
value: 7.5520000000000005
- type: map_at_10
value: 12.378
- type: map_at_100
value: 13.572999999999999
- type: map_at_1000
value: 13.79
- type: map_at_3
value: 10.737
- type: map_at_5
value: 11.629000000000001
- type: mrr_at_1
value: 15.123000000000001
- type: mrr_at_10
value: 21.001
- type: mrr_at_100
value: 22.112000000000002
- type: mrr_at_1000
value: 22.21
- type: mrr_at_3
value: 19.264
- type: mrr_at_5
value: 20.166999999999998
- type: ndcg_at_1
value: 15.123000000000001
- type: ndcg_at_10
value: 16.699
- type: ndcg_at_100
value: 22.688
- type: ndcg_at_1000
value: 27.394000000000002
- type: ndcg_at_3
value: 14.516000000000002
- type: ndcg_at_5
value: 15.336
- type: precision_at_1
value: 15.123000000000001
- type: precision_at_10
value: 4.7219999999999995
- type: precision_at_100
value: 1.065
- type: precision_at_1000
value: 0.188
- type: precision_at_3
value: 9.825000000000001
- type: precision_at_5
value: 7.284
- type: recall_at_1
value: 7.5520000000000005
- type: recall_at_10
value: 20.887
- type: recall_at_100
value: 44.613
- type: recall_at_1000
value: 73.55699999999999
- type: recall_at_3
value: 13.715
- type: recall_at_5
value: 16.75
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA
type: None
config: default
split: test
revision: ab518f4d6fcca38d87c25209f94beba119d02014
metrics:
- type: map_at_1
value: 15.314
- type: map_at_10
value: 21.73
- type: map_at_100
value: 22.595000000000002
- type: map_at_1000
value: 22.7
- type: map_at_3
value: 19.914
- type: map_at_5
value: 20.891000000000002
- type: mrr_at_1
value: 30.628
- type: mrr_at_10
value: 37.302
- type: mrr_at_100
value: 38.04
- type: mrr_at_1000
value: 38.102999999999994
- type: mrr_at_3
value: 35.445
- type: mrr_at_5
value: 36.464999999999996
- type: ndcg_at_1
value: 30.628
- type: ndcg_at_10
value: 27.986
- type: ndcg_at_100
value: 32.103
- type: ndcg_at_1000
value: 34.739
- type: ndcg_at_3
value: 24.48
- type: ndcg_at_5
value: 26.125
- type: precision_at_1
value: 30.628
- type: precision_at_10
value: 6.243
- type: precision_at_100
value: 0.955
- type: precision_at_1000
value: 0.131
- type: precision_at_3
value: 15.517
- type: precision_at_5
value: 10.613999999999999
- type: recall_at_1
value: 15.314
- type: recall_at_10
value: 31.215
- type: recall_at_100
value: 47.752
- type: recall_at_1000
value: 65.422
- type: recall_at_3
value: 23.275000000000002
- type: recall_at_5
value: 26.535999999999998
- task:
type: Classification
dataset:
name: MTEB ImdbClassification
type: None
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 61.661200000000015
- type: ap
value: 57.26137842361126
- type: f1
value: 61.44069729315865
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO
type: None
config: default
split: dev
revision: c5a29a104738b98a9e76336939199e264163d4a0
metrics:
- type: map_at_1
value: 6.308999999999999
- type: map_at_10
value: 11.003
- type: map_at_100
value: 11.865
- type: map_at_1000
value: 11.974
- type: map_at_3
value: 9.309000000000001
- type: map_at_5
value: 10.145999999999999
- type: mrr_at_1
value: 6.5329999999999995
- type: mrr_at_10
value: 11.296000000000001
- type: mrr_at_100
value: 12.168
- type: mrr_at_1000
value: 12.273
- type: mrr_at_3
value: 9.582
- type: mrr_at_5
value: 10.42
- type: ndcg_at_1
value: 6.519
- type: ndcg_at_10
value: 13.998
- type: ndcg_at_100
value: 18.701
- type: ndcg_at_1000
value: 21.944
- type: ndcg_at_3
value: 10.383000000000001
- type: ndcg_at_5
value: 11.898
- type: precision_at_1
value: 6.519
- type: precision_at_10
value: 2.4330000000000003
- type: precision_at_100
value: 0.486
- type: precision_at_1000
value: 0.077
- type: precision_at_3
value: 4.585
- type: precision_at_5
value: 3.5130000000000003
- type: recall_at_1
value: 6.308999999999999
- type: recall_at_10
value: 23.381
- type: recall_at_100
value: 46.25
- type: recall_at_1000
value: 72.261
- type: recall_at_3
value: 13.239
- type: recall_at_5
value: 16.902
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (en)
type: None
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 87.84313725490198
- type: f1
value: 87.24204022782286
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (en)
type: None
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 56.409028727770185
- type: f1
value: 38.57449573016968
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (en)
type: None
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 62.010759919300604
- type: f1
value: 60.290520300650584
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (en)
type: None
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 70.65232010759918
- type: f1
value: 69.36104886302014
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringP2P
type: None
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 30.364401278066065
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringS2S
type: None
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 28.00495863318603
- task:
type: Reranking
dataset:
name: MTEB MindSmallReranking
type: None
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 30.917670435424853
- type: mrr
value: 31.929615376181395
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus
type: None
config: default
split: test
revision: ec0fa4fe99da2ff19ca1214b7966684033a58814
metrics:
- type: map_at_1
value: 4.279
- type: map_at_10
value: 8.491999999999999
- type: map_at_100
value: 10.969
- type: map_at_1000
value: 12.396
- type: map_at_3
value: 6.254999999999999
- type: map_at_5
value: 7.417
- type: mrr_at_1
value: 34.056
- type: mrr_at_10
value: 43.877
- type: mrr_at_100
value: 44.590999999999994
- type: mrr_at_1000
value: 44.651
- type: mrr_at_3
value: 41.382999999999996
- type: mrr_at_5
value: 42.838
- type: ndcg_at_1
value: 32.198
- type: ndcg_at_10
value: 25.971
- type: ndcg_at_100
value: 25.112000000000002
- type: ndcg_at_1000
value: 34.83
- type: ndcg_at_3
value: 29.018
- type: ndcg_at_5
value: 28.447
- type: precision_at_1
value: 34.056
- type: precision_at_10
value: 19.412
- type: precision_at_100
value: 7.053
- type: precision_at_1000
value: 2.061
- type: precision_at_3
value: 27.761000000000003
- type: precision_at_5
value: 25.076999999999998
- type: recall_at_1
value: 4.279
- type: recall_at_10
value: 12.917000000000002
- type: recall_at_100
value: 27.386
- type: recall_at_1000
value: 62.90599999999999
- type: recall_at_3
value: 7.234999999999999
- type: recall_at_5
value: 9.866
- task:
type: Retrieval
dataset:
name: MTEB NQ
type: None
config: default
split: test
revision: b774495ed302d8c44a3a7ea25c90dbce03968f31
metrics:
- type: map_at_1
value: 8.427
- type: map_at_10
value: 14.471
- type: map_at_100
value: 15.704
- type: map_at_1000
value: 15.809000000000001
- type: map_at_3
value: 12.059000000000001
- type: map_at_5
value: 13.288
- type: mrr_at_1
value: 9.647
- type: mrr_at_10
value: 16.064999999999998
- type: mrr_at_100
value: 17.212
- type: mrr_at_1000
value: 17.297
- type: mrr_at_3
value: 13.562
- type: mrr_at_5
value: 14.843
- type: ndcg_at_1
value: 9.647
- type: ndcg_at_10
value: 18.613
- type: ndcg_at_100
value: 24.834999999999997
- type: ndcg_at_1000
value: 27.716
- type: ndcg_at_3
value: 13.605
- type: ndcg_at_5
value: 15.797
- type: precision_at_1
value: 9.647
- type: precision_at_10
value: 3.531
- type: precision_at_100
value: 0.7060000000000001
- type: precision_at_1000
value: 0.098
- type: precision_at_3
value: 6.431000000000001
- type: precision_at_5
value: 5.093
- type: recall_at_1
value: 8.427
- type: recall_at_10
value: 29.995
- type: recall_at_100
value: 58.760999999999996
- type: recall_at_1000
value: 81.033
- type: recall_at_3
value: 16.621
- type: recall_at_5
value: 21.69
- task:
type: Retrieval
dataset:
name: MTEB QuoraRetrieval
type: None
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 63.709
- type: map_at_10
value: 76.66
- type: map_at_100
value: 77.444
- type: map_at_1000
value: 77.474
- type: map_at_3
value: 73.639
- type: map_at_5
value: 75.495
- type: mrr_at_1
value: 73.42
- type: mrr_at_10
value: 80.643
- type: mrr_at_100
value: 80.886
- type: mrr_at_1000
value: 80.891
- type: mrr_at_3
value: 79.163
- type: mrr_at_5
value: 80.132
- type: ndcg_at_1
value: 73.44000000000001
- type: ndcg_at_10
value: 81.26100000000001
- type: ndcg_at_100
value: 83.34
- type: ndcg_at_1000
value: 83.65599999999999
- type: ndcg_at_3
value: 77.593
- type: ndcg_at_5
value: 79.552
- type: precision_at_1
value: 73.44000000000001
- type: precision_at_10
value: 12.356
- type: precision_at_100
value: 1.472
- type: precision_at_1000
value: 0.155
- type: precision_at_3
value: 33.733000000000004
- type: precision_at_5
value: 22.398
- type: recall_at_1
value: 63.709
- type: recall_at_10
value: 90.24
- type: recall_at_100
value: 97.992
- type: recall_at_1000
value: 99.725
- type: recall_at_3
value: 79.843
- type: recall_at_5
value: 85.199
- type: map_at_1
value: 3.098
- type: map_at_10
value: 7.359999999999999
- type: map_at_100
value: 8.888
- type: map_at_1000
value: 9.158
- type: map_at_3
value: 5.406
- type: map_at_5
value: 6.308999999999999
- type: mrr_at_1
value: 15.2
- type: mrr_at_10
value: 23.508000000000003
- type: mrr_at_100
value: 24.709
- type: mrr_at_1000
value: 24.787
- type: mrr_at_3
value: 20.383000000000003
- type: mrr_at_5
value: 22.103
- type: ndcg_at_1
value: 15.2
- type: ndcg_at_10
value: 13.174
- type: ndcg_at_100
value: 19.885
- type: ndcg_at_1000
value: 25.247999999999998
- type: ndcg_at_3
value: 12.242
- type: ndcg_at_5
value: 10.702
- type: precision_at_1
value: 15.2
- type: precision_at_10
value: 6.93
- type: precision_at_100
value: 1.6709999999999998
- type: precision_at_1000
value: 0.296
- type: precision_at_3
value: 11.4
- type: precision_at_5
value: 9.379999999999999
- type: recall_at_1
value: 3.098
- type: recall_at_10
value: 14.048
- type: recall_at_100
value: 33.902
- type: recall_at_1000
value: 60.17
- type: recall_at_3
value: 6.9430000000000005
- type: recall_at_5
value: 9.498
- type: map_at_1
value: 0.125
- type: map_at_10
value: 0.86
- type: map_at_100
value: 4.665
- type: map_at_1000
value: 11.877
- type: map_at_3
value: 0.299
- type: map_at_5
value: 0.47200000000000003
- type: mrr_at_1
value: 50.0
- type: mrr_at_10
value: 64.711
- type: mrr_at_100
value: 65.065
- type: mrr_at_1000
value: 65.065
- type: mrr_at_3
value: 62.0
- type: mrr_at_5
value: 62.9
- type: ndcg_at_1
value: 43.0
- type: ndcg_at_10
value: 43.147999999999996
- type: ndcg_at_100
value: 33.417
- type: ndcg_at_1000
value: 31.341
- type: ndcg_at_3
value: 43.653999999999996
- type: ndcg_at_5
value: 43.21
- type: precision_at_1
value: 50.0
- type: precision_at_10
value: 48.199999999999996
- type: precision_at_100
value: 35.46
- type: precision_at_1000
value: 15.342
- type: precision_at_3
value: 48.0
- type: precision_at_5
value: 47.599999999999994
- type: recall_at_1
value: 0.125
- type: recall_at_10
value: 1.145
- type: recall_at_100
value: 7.727
- type: recall_at_1000
value: 30.742000000000004
- type: recall_at_3
value: 0.356
- type: recall_at_5
value: 0.5780000000000001
- task:
type: Clustering
dataset:
name: MTEB RedditClustering
type: None
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 42.214155529412366
- task:
type: Clustering
dataset:
name: MTEB RedditClusteringP2P
type: None
config: default
split: test
revision: 282350215ef01743dc01b456c7f5241fa8937f16
metrics:
- type: v_measure
value: 48.10171269080449
- task:
type: STS
dataset:
name: MTEB SICK-R
type: None
config: default
split: test
revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
metrics:
- type: cos_sim_pearson
value: 76.69196724733715
- type: cos_sim_spearman
value: 65.00669029968084
- type: euclidean_pearson
value: 71.35623218354901
- type: euclidean_spearman
value: 65.00662504036774
- type: manhattan_pearson
value: 69.46286814034032
- type: manhattan_spearman
value: 64.05091703970768
- task:
type: STS
dataset:
name: MTEB STS12
type: None
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_pearson
value: 75.45675254280496
- type: cos_sim_spearman
value: 67.48465522195806
- type: euclidean_pearson
value: 71.932572180082
- type: euclidean_spearman
value: 67.48597260989263
- type: manhattan_pearson
value: 70.01381315407934
- type: manhattan_spearman
value: 66.83129276722313
- task:
type: STS
dataset:
name: MTEB STS13
type: None
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_pearson
value: 75.56784955823615
- type: cos_sim_spearman
value: 77.1656947836492
- type: euclidean_pearson
value: 76.86159714478943
- type: euclidean_spearman
value: 77.16570697849755
- type: manhattan_pearson
value: 77.05983226779968
- type: manhattan_spearman
value: 77.43229771628044
- task:
type: STS
dataset:
name: MTEB STS14
type: None
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_pearson
value: 77.28801641888653
- type: cos_sim_spearman
value: 72.72947194978411
- type: euclidean_pearson
value: 76.2115552769551
- type: euclidean_spearman
value: 72.72946226092458
- type: manhattan_pearson
value: 75.19019262864614
- type: manhattan_spearman
value: 72.18378967267259
- task:
type: STS
dataset:
name: MTEB STS15
type: None
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_pearson
value: 79.73471725204746
- type: cos_sim_spearman
value: 80.79015625826382
- type: euclidean_pearson
value: 80.81110611872813
- type: euclidean_spearman
value: 80.79016252191039
- type: manhattan_pearson
value: 79.93979968573043
- type: manhattan_spearman
value: 80.07556394648903
- task:
type: STS
dataset:
name: MTEB STS16
type: None
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_pearson
value: 74.47923638473124
- type: cos_sim_spearman
value: 75.71286196807024
- type: euclidean_pearson
value: 75.83804880943377
- type: euclidean_spearman
value: 75.71341236422742
- type: manhattan_pearson
value: 75.93646913049322
- type: manhattan_spearman
value: 75.85181752457555
- task:
type: STS
dataset:
name: MTEB STS17 (en-en)
type: None
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 82.62219071209913
- type: cos_sim_spearman
value: 83.44167690000958
- type: euclidean_pearson
value: 83.28214784087085
- type: euclidean_spearman
value: 83.44255138870209
- type: manhattan_pearson
value: 82.77261607066816
- type: manhattan_spearman
value: 83.06899474864443
- task:
type: STS
dataset:
name: MTEB STS22 (en)
type: None
config: en
split: test
revision: eea2b4fe26a775864c896887d910b76a8098ad3f
metrics:
- type: cos_sim_pearson
value: 64.70345108985259
- type: cos_sim_spearman
value: 62.482753044620786
- type: euclidean_pearson
value: 64.79437494489187
- type: euclidean_spearman
value: 62.482753044620786
- type: manhattan_pearson
value: 63.71939825347573
- type: manhattan_spearman
value: 61.174953862000336
- task:
type: STS
dataset:
name: MTEB STSBenchmark
type: None
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_pearson
value: 76.07865440954043
- type: cos_sim_spearman
value: 74.54667758834077
- type: euclidean_pearson
value: 76.48558570428264
- type: euclidean_spearman
value: 74.54672598094477
- type: manhattan_pearson
value: 76.06256712227383
- type: manhattan_spearman
value: 74.42758128821515
- task:
type: Reranking
dataset:
name: MTEB SciDocsRR
type: None
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 75.15143418949978
- type: mrr
value: 91.98409705762647
- task:
type: Retrieval
dataset:
name: MTEB SciFact
type: None
config: default
split: test
revision: 0228b52cf27578f30900b9e5271d331663a030d7
metrics:
- type: map_at_1
value: 33.417
- type: map_at_10
value: 42.594
- type: map_at_100
value: 43.535000000000004
- type: map_at_1000
value: 43.6
- type: map_at_3
value: 39.759
- type: map_at_5
value: 41.506
- type: mrr_at_1
value: 35.667
- type: mrr_at_10
value: 44.446000000000005
- type: mrr_at_100
value: 45.244
- type: mrr_at_1000
value: 45.300000000000004
- type: mrr_at_3
value: 42.167
- type: mrr_at_5
value: 43.5
- type: ndcg_at_1
value: 35.667
- type: ndcg_at_10
value: 47.591
- type: ndcg_at_100
value: 52.611
- type: ndcg_at_1000
value: 54.31
- type: ndcg_at_3
value: 42.356
- type: ndcg_at_5
value: 45.194
- type: precision_at_1
value: 35.667
- type: precision_at_10
value: 6.7669999999999995
- type: precision_at_100
value: 0.967
- type: precision_at_1000
value: 0.11100000000000002
- type: precision_at_3
value: 16.889000000000003
- type: precision_at_5
value: 11.799999999999999
- type: recall_at_1
value: 33.417
- type: recall_at_10
value: 61.260999999999996
- type: recall_at_100
value: 85.556
- type: recall_at_1000
value: 98.867
- type: recall_at_3
value: 47.528
- type: recall_at_5
value: 54.388999999999996
- task:
type: PairClassification
dataset:
name: MTEB SprintDuplicateQuestions
type: None
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.73267326732673
- type: cos_sim_ap
value: 92.36951341438333
- type: cos_sim_f1
value: 86.04073522106309
- type: cos_sim_precision
value: 85.48864758144127
- type: cos_sim_recall
value: 86.6
- type: dot_accuracy
value: 99.73267326732673
- type: dot_ap
value: 92.36951341438333
- type: dot_f1
value: 86.04073522106309
- type: dot_precision
value: 85.48864758144127
- type: dot_recall
value: 86.6
- type: euclidean_accuracy
value: 99.73267326732673
- type: euclidean_ap
value: 92.36951341438333
- type: euclidean_f1
value: 86.04073522106309
- type: euclidean_precision
value: 85.48864758144127
- type: euclidean_recall
value: 86.6
- type: manhattan_accuracy
value: 99.74455445544554
- type: manhattan_ap
value: 92.96894184904977
- type: manhattan_f1
value: 86.8917576961271
- type: manhattan_precision
value: 86.29191321499013
- type: manhattan_recall
value: 87.5
- type: max_accuracy
value: 99.74455445544554
- type: max_ap
value: 92.96894184904977
- type: max_f1
value: 86.8917576961271
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClustering
type: None
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 45.349940718460374
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClusteringP2P
type: None
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 31.266631844140036
- task:
type: Reranking
dataset:
name: MTEB StackOverflowDupQuestions
type: None
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 42.02550203348626
- type: mrr
value: 42.442651302945414
- task:
type: Summarization
dataset:
name: MTEB SummEval
type: None
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 30.22842420698354
- type: cos_sim_spearman
value: 30.568909812744543
- type: dot_pearson
value: 30.228424144316747
- type: dot_spearman
value: 30.619692862283827
- task:
type: Retrieval
dataset:
name: MTEB Touche2020
type: None
config: default
split: test
revision: a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f
metrics:
- type: map_at_1
value: 1.585
- type: map_at_10
value: 7.398000000000001
- type: map_at_100
value: 13.603000000000002
- type: map_at_1000
value: 15.267
- type: map_at_3
value: 3.857
- type: map_at_5
value: 5.509
- type: mrr_at_1
value: 24.490000000000002
- type: mrr_at_10
value: 39.883
- type: mrr_at_100
value: 41.082
- type: mrr_at_1000
value: 41.082
- type: mrr_at_3
value: 35.034
- type: mrr_at_5
value: 37.483
- type: ndcg_at_1
value: 23.469
- type: ndcg_at_10
value: 21.221999999999998
- type: ndcg_at_100
value: 34.851
- type: ndcg_at_1000
value: 46.26
- type: ndcg_at_3
value: 21.906
- type: ndcg_at_5
value: 21.229
- type: precision_at_1
value: 24.490000000000002
- type: precision_at_10
value: 19.796
- type: precision_at_100
value: 8.122
- type: precision_at_1000
value: 1.541
- type: precision_at_3
value: 23.810000000000002
- type: precision_at_5
value: 22.041
- type: recall_at_1
value: 1.585
- type: recall_at_10
value: 13.664000000000001
- type: recall_at_100
value: 49.559
- type: recall_at_1000
value: 83.978
- type: recall_at_3
value: 5.088
- type: recall_at_5
value: 8.203000000000001
- task:
type: Classification
dataset:
name: MTEB ToxicConversationsClassification
type: None
config: default
split: test
revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
metrics:
- type: accuracy
value: 71.68520000000001
- type: ap
value: 14.622321024533974
- type: f1
value: 55.1924859473184
- task:
type: Classification
dataset:
name: MTEB TweetSentimentExtractionClassification
type: None
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 53.34748160724392
- type: f1
value: 53.518629300332755
- task:
type: Clustering
dataset:
name: MTEB TwentyNewsgroupsClustering
type: None
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 40.22582442073446
- task:
type: PairClassification
dataset:
name: MTEB TwitterSemEval2015
type: None
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 82.46408773916671
- type: cos_sim_ap
value: 60.57612839124909
- type: cos_sim_f1
value: 58.366606170598914
- type: cos_sim_precision
value: 53.899441340782126
- type: cos_sim_recall
value: 63.641160949868066
- type: dot_accuracy
value: 82.46408773916671
- type: dot_ap
value: 60.57612839124909
- type: dot_f1
value: 58.366606170598914
- type: dot_precision
value: 53.899441340782126
- type: dot_recall
value: 63.641160949868066
- type: euclidean_accuracy
value: 82.46408773916671
- type: euclidean_ap
value: 60.57612839124909
- type: euclidean_f1
value: 58.366606170598914
- type: euclidean_precision
value: 53.899441340782126
- type: euclidean_recall
value: 63.641160949868066
- type: manhattan_accuracy
value: 81.68921738093819
- type: manhattan_ap
value: 58.62502289564927
- type: manhattan_f1
value: 57.40318906605921
- type: manhattan_precision
value: 50.50100200400801
- type: manhattan_recall
value: 66.49076517150397
- type: max_accuracy
value: 82.46408773916671
- type: max_ap
value: 60.57612839124909
- type: max_f1
value: 58.366606170598914
- task:
type: PairClassification
dataset:
name: MTEB TwitterURLCorpus
type: None
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 86.89602980556526
- type: cos_sim_ap
value: 81.92992391915341
- type: cos_sim_f1
value: 74.31139877741819
- type: cos_sim_precision
value: 69.71393873971124
- type: cos_sim_recall
value: 79.55805358792732
- type: dot_accuracy
value: 86.89602980556526
- type: dot_ap
value: 81.92992407440505
- type: dot_f1
value: 74.31139877741819
- type: dot_precision
value: 69.71393873971124
- type: dot_recall
value: 79.55805358792732
- type: euclidean_accuracy
value: 86.89602980556526
- type: euclidean_ap
value: 81.92992329073074
- type: euclidean_f1
value: 74.31139877741819
- type: euclidean_precision
value: 69.71393873971124
- type: euclidean_recall
value: 79.55805358792732
- type: manhattan_accuracy
value: 86.94454146776886
- type: manhattan_ap
value: 81.96535237136042
- type: manhattan_f1
value: 74.41181834761991
- type: manhattan_precision
value: 70.70076939072572
- type: manhattan_recall
value: 78.53403141361257
- type: max_accuracy
value: 86.94454146776886
- type: max_ap
value: 81.96535237136042
- type: max_f1
value: 74.41181834761991
---
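
The YAML front matter above records MTEB benchmark scores (classification, clustering, retrieval, reranking, STS, pair classification, and summarization tasks). As a minimal sketch of how such scores can be reproduced, the snippet below runs a single task from the card with the `mteb` library, assuming the checkpoint is loadable via `sentence-transformers`; the model path `"path/to/mpnet_main"` is a placeholder, not a published identifier.

```python
# Minimal sketch (not part of the original card): re-running one MTEB task
# from the results above. Assumes `pip install mteb sentence-transformers`.
from mteb import MTEB
from sentence_transformers import SentenceTransformer

# Placeholder path -- substitute the actual checkpoint for this model card.
model = SentenceTransformer("path/to/mpnet_main")

# Evaluate on one task reported above, e.g. Banking77 classification.
evaluation = MTEB(tasks=["Banking77Classification"])
results = evaluation.run(model, output_folder="results/mpnet_main")
print(results)
```

Scores produced this way land in `results/mpnet_main` as JSON, one file per task, which is the same data that populates the `model-index` block above.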
"value": 71.932572180082}, {"type": "euclidean_spearman", "value": 67.48597260989263}, {"type": "manhattan_pearson", "value": 70.01381315407934}, {"type": "manhattan_spearman", "value": 66.83129276722313}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS13", "type": "None", "config": "default", "split": "test", "revision": "7e90230a92c190f1bf69ae9002b8cea547a64cca"}, "metrics": [{"type": "cos_sim_pearson", "value": 75.56784955823615}, {"type": "cos_sim_spearman", "value": 77.1656947836492}, {"type": "euclidean_pearson", "value": 76.86159714478943}, {"type": "euclidean_spearman", "value": 77.16570697849755}, {"type": "manhattan_pearson", "value": 77.05983226779968}, {"type": "manhattan_spearman", "value": 77.43229771628044}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS14", "type": "None", "config": "default", "split": "test", "revision": "6031580fec1f6af667f0bd2da0a551cf4f0b2375"}, "metrics": [{"type": "cos_sim_pearson", "value": 77.28801641888653}, {"type": "cos_sim_spearman", "value": 72.72947194978411}, {"type": "euclidean_pearson", "value": 76.2115552769551}, {"type": "euclidean_spearman", "value": 72.72946226092458}, {"type": "manhattan_pearson", "value": 75.19019262864614}, {"type": "manhattan_spearman", "value": 72.18378967267259}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS15", "type": "None", "config": "default", "split": "test", "revision": "ae752c7c21bf194d8b67fd573edf7ae58183cbe3"}, "metrics": [{"type": "cos_sim_pearson", "value": 79.73471725204746}, {"type": "cos_sim_spearman", "value": 80.79015625826382}, {"type": "euclidean_pearson", "value": 80.81110611872813}, {"type": "euclidean_spearman", "value": 80.79016252191039}, {"type": "manhattan_pearson", "value": 79.93979968573043}, {"type": "manhattan_spearman", "value": 80.07556394648903}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS16", "type": "None", "config": "default", "split": "test", "revision": "4d8694f8f0e0100860b497b999b3dbed754a0513"}, "metrics": [{"type": "cos_sim_pearson", "value": 74.47923638473124}, {"type": "cos_sim_spearman", "value": 75.71286196807024}, {"type": "euclidean_pearson", "value": 75.83804880943377}, {"type": "euclidean_spearman", "value": 75.71341236422742}, {"type": "manhattan_pearson", "value": 75.93646913049322}, {"type": "manhattan_spearman", "value": 75.85181752457555}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS17 (en-en)", "type": "None", "config": "en-en", "split": "test", "revision": "af5e6fb845001ecf41f4c1e033ce921939a2a68d"}, "metrics": [{"type": "cos_sim_pearson", "value": 82.62219071209913}, {"type": "cos_sim_spearman", "value": 83.44167690000958}, {"type": "euclidean_pearson", "value": 83.28214784087085}, {"type": "euclidean_spearman", "value": 83.44255138870209}, {"type": "manhattan_pearson", "value": 82.77261607066816}, {"type": "manhattan_spearman", "value": 83.06899474864443}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS22 (en)", "type": "None", "config": "en", "split": "test", "revision": "eea2b4fe26a775864c896887d910b76a8098ad3f"}, "metrics": [{"type": "cos_sim_pearson", "value": 64.70345108985259}, {"type": "cos_sim_spearman", "value": 62.482753044620786}, {"type": "euclidean_pearson", "value": 64.79437494489187}, {"type": "euclidean_spearman", "value": 62.482753044620786}, {"type": "manhattan_pearson", "value": 63.71939825347573}, {"type": "manhattan_spearman", "value": 61.174953862000336}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STSBenchmark", "type": "None", "config": "default", 
"split": "test", "revision": "b0fddb56ed78048fa8b90373c8a3cfc37b684831"}, "metrics": [{"type": "cos_sim_pearson", "value": 76.07865440954043}, {"type": "cos_sim_spearman", "value": 74.54667758834077}, {"type": "euclidean_pearson", "value": 76.48558570428264}, {"type": "euclidean_spearman", "value": 74.54672598094477}, {"type": "manhattan_pearson", "value": 76.06256712227383}, {"type": "manhattan_spearman", "value": 74.42758128821515}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB SciDocsRR", "type": "None", "config": "default", "split": "test", "revision": "d3c5e1fc0b855ab6097bf1cda04dd73947d7caab"}, "metrics": [{"type": "map", "value": 75.15143418949978}, {"type": "mrr", "value": 91.98409705762647}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB SciFact", "type": "None", "config": "default", "split": "test", "revision": "0228b52cf27578f30900b9e5271d331663a030d7"}, "metrics": [{"type": "map_at_1", "value": 33.417}, {"type": "map_at_10", "value": 42.594}, {"type": "map_at_100", "value": 43.535000000000004}, {"type": "map_at_1000", "value": 43.6}, {"type": "map_at_3", "value": 39.759}, {"type": "map_at_5", "value": 41.506}, {"type": "mrr_at_1", "value": 35.667}, {"type": "mrr_at_10", "value": 44.446000000000005}, {"type": "mrr_at_100", "value": 45.244}, {"type": "mrr_at_1000", "value": 45.300000000000004}, {"type": "mrr_at_3", "value": 42.167}, {"type": "mrr_at_5", "value": 43.5}, {"type": "ndcg_at_1", "value": 35.667}, {"type": "ndcg_at_10", "value": 47.591}, {"type": "ndcg_at_100", "value": 52.611}, {"type": "ndcg_at_1000", "value": 54.31}, {"type": "ndcg_at_3", "value": 42.356}, {"type": "ndcg_at_5", "value": 45.194}, {"type": "precision_at_1", "value": 35.667}, {"type": "precision_at_10", "value": 6.7669999999999995}, {"type": "precision_at_100", "value": 0.967}, {"type": "precision_at_1000", "value": 0.11100000000000002}, {"type": "precision_at_3", "value": 16.889000000000003}, {"type": "precision_at_5", "value": 11.799999999999999}, {"type": "recall_at_1", "value": 33.417}, {"type": "recall_at_10", "value": 61.260999999999996}, {"type": "recall_at_100", "value": 85.556}, {"type": "recall_at_1000", "value": 98.867}, {"type": "recall_at_3", "value": 47.528}, {"type": "recall_at_5", "value": 54.388999999999996}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB SprintDuplicateQuestions", "type": "None", "config": "default", "split": "test", "revision": "d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46"}, "metrics": [{"type": "cos_sim_accuracy", "value": 99.73267326732673}, {"type": "cos_sim_ap", "value": 92.36951341438333}, {"type": "cos_sim_f1", "value": 86.04073522106309}, {"type": "cos_sim_precision", "value": 85.48864758144127}, {"type": "cos_sim_recall", "value": 86.6}, {"type": "dot_accuracy", "value": 99.73267326732673}, {"type": "dot_ap", "value": 92.36951341438333}, {"type": "dot_f1", "value": 86.04073522106309}, {"type": "dot_precision", "value": 85.48864758144127}, {"type": "dot_recall", "value": 86.6}, {"type": "euclidean_accuracy", "value": 99.73267326732673}, {"type": "euclidean_ap", "value": 92.36951341438333}, {"type": "euclidean_f1", "value": 86.04073522106309}, {"type": "euclidean_precision", "value": 85.48864758144127}, {"type": "euclidean_recall", "value": 86.6}, {"type": "manhattan_accuracy", "value": 99.74455445544554}, {"type": "manhattan_ap", "value": 92.96894184904977}, {"type": "manhattan_f1", "value": 86.8917576961271}, {"type": "manhattan_precision", "value": 86.29191321499013}, {"type": "manhattan_recall", "value": 87.5}, 
{"type": "max_accuracy", "value": 99.74455445544554}, {"type": "max_ap", "value": 92.96894184904977}, {"type": "max_f1", "value": 86.8917576961271}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB StackExchangeClustering", "type": "None", "config": "default", "split": "test", "revision": "6cbc1f7b2bc0622f2e39d2c77fa502909748c259"}, "metrics": [{"type": "v_measure", "value": 45.349940718460374}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB StackExchangeClusteringP2P", "type": "None", "config": "default", "split": "test", "revision": "815ca46b2622cec33ccafc3735d572c266efdb44"}, "metrics": [{"type": "v_measure", "value": 31.266631844140036}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB StackOverflowDupQuestions", "type": "None", "config": "default", "split": "test", "revision": "e185fbe320c72810689fc5848eb6114e1ef5ec69"}, "metrics": [{"type": "map", "value": 42.02550203348626}, {"type": "mrr", "value": 42.442651302945414}]}, {"task": {"type": "Summarization"}, "dataset": {"name": "MTEB SummEval", "type": "None", "config": "default", "split": "test", "revision": "cda12ad7615edc362dbf25a00fdd61d3b1eaf93c"}, "metrics": [{"type": "cos_sim_pearson", "value": 30.22842420698354}, {"type": "cos_sim_spearman", "value": 30.568909812744543}, {"type": "dot_pearson", "value": 30.228424144316747}, {"type": "dot_spearman", "value": 30.619692862283827}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB Touche2020", "type": "None", "config": "default", "split": "test", "revision": "a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f"}, "metrics": [{"type": "map_at_1", "value": 1.585}, {"type": "map_at_10", "value": 7.398000000000001}, {"type": "map_at_100", "value": 13.603000000000002}, {"type": "map_at_1000", "value": 15.267}, {"type": "map_at_3", "value": 3.857}, {"type": "map_at_5", "value": 5.509}, {"type": "mrr_at_1", "value": 24.490000000000002}, {"type": "mrr_at_10", "value": 39.883}, {"type": "mrr_at_100", "value": 41.082}, {"type": "mrr_at_1000", "value": 41.082}, {"type": "mrr_at_3", "value": 35.034}, {"type": "mrr_at_5", "value": 37.483}, {"type": "ndcg_at_1", "value": 23.469}, {"type": "ndcg_at_10", "value": 21.221999999999998}, {"type": "ndcg_at_100", "value": 34.851}, {"type": "ndcg_at_1000", "value": 46.26}, {"type": "ndcg_at_3", "value": 21.906}, {"type": "ndcg_at_5", "value": 21.229}, {"type": "precision_at_1", "value": 24.490000000000002}, {"type": "precision_at_10", "value": 19.796}, {"type": "precision_at_100", "value": 8.122}, {"type": "precision_at_1000", "value": 1.541}, {"type": "precision_at_3", "value": 23.810000000000002}, {"type": "precision_at_5", "value": 22.041}, {"type": "recall_at_1", "value": 1.585}, {"type": "recall_at_10", "value": 13.664000000000001}, {"type": "recall_at_100", "value": 49.559}, {"type": "recall_at_1000", "value": 83.978}, {"type": "recall_at_3", "value": 5.088}, {"type": "recall_at_5", "value": 8.203000000000001}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB ToxicConversationsClassification", "type": "None", "config": "default", "split": "test", "revision": "d7c0de2777da35d6aae2200a62c6e0e5af397c4c"}, "metrics": [{"type": "accuracy", "value": 71.68520000000001}, {"type": "ap", "value": 14.622321024533974}, {"type": "f1", "value": 55.1924859473184}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB TweetSentimentExtractionClassification", "type": "None", "config": "default", "split": "test", "revision": "d604517c81ca91fe16a244d1248fc021f9ecee7a"}, "metrics": [{"type": "accuracy", 
"value": 53.34748160724392}, {"type": "f1", "value": 53.518629300332755}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB TwentyNewsgroupsClustering", "type": "None", "config": "default", "split": "test", "revision": "6125ec4e24fa026cec8a478383ee943acfbd5449"}, "metrics": [{"type": "v_measure", "value": 40.22582442073446}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB TwitterSemEval2015", "type": "None", "config": "default", "split": "test", "revision": "70970daeab8776df92f5ea462b6173c0b46fd2d1"}, "metrics": [{"type": "cos_sim_accuracy", "value": 82.46408773916671}, {"type": "cos_sim_ap", "value": 60.57612839124909}, {"type": "cos_sim_f1", "value": 58.366606170598914}, {"type": "cos_sim_precision", "value": 53.899441340782126}, {"type": "cos_sim_recall", "value": 63.641160949868066}, {"type": "dot_accuracy", "value": 82.46408773916671}, {"type": "dot_ap", "value": 60.57612839124909}, {"type": "dot_f1", "value": 58.366606170598914}, {"type": "dot_precision", "value": 53.899441340782126}, {"type": "dot_recall", "value": 63.641160949868066}, {"type": "euclidean_accuracy", "value": 82.46408773916671}, {"type": "euclidean_ap", "value": 60.57612839124909}, {"type": "euclidean_f1", "value": 58.366606170598914}, {"type": "euclidean_precision", "value": 53.899441340782126}, {"type": "euclidean_recall", "value": 63.641160949868066}, {"type": "manhattan_accuracy", "value": 81.68921738093819}, {"type": "manhattan_ap", "value": 58.62502289564927}, {"type": "manhattan_f1", "value": 57.40318906605921}, {"type": "manhattan_precision", "value": 50.50100200400801}, {"type": "manhattan_recall", "value": 66.49076517150397}, {"type": "max_accuracy", "value": 82.46408773916671}, {"type": "max_ap", "value": 60.57612839124909}, {"type": "max_f1", "value": 58.366606170598914}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB TwitterURLCorpus", "type": "None", "config": "default", "split": "test", "revision": "8b6510b0b1fa4e4c4f879467980e9be563ec1cdf"}, "metrics": [{"type": "cos_sim_accuracy", "value": 86.89602980556526}, {"type": "cos_sim_ap", "value": 81.92992391915341}, {"type": "cos_sim_f1", "value": 74.31139877741819}, {"type": "cos_sim_precision", "value": 69.71393873971124}, {"type": "cos_sim_recall", "value": 79.55805358792732}, {"type": "dot_accuracy", "value": 86.89602980556526}, {"type": "dot_ap", "value": 81.92992407440505}, {"type": "dot_f1", "value": 74.31139877741819}, {"type": "dot_precision", "value": 69.71393873971124}, {"type": "dot_recall", "value": 79.55805358792732}, {"type": "euclidean_accuracy", "value": 86.89602980556526}, {"type": "euclidean_ap", "value": 81.92992329073074}, {"type": "euclidean_f1", "value": 74.31139877741819}, {"type": "euclidean_precision", "value": 69.71393873971124}, {"type": "euclidean_recall", "value": 79.55805358792732}, {"type": "manhattan_accuracy", "value": 86.94454146776886}, {"type": "manhattan_ap", "value": 81.96535237136042}, {"type": "manhattan_f1", "value": 74.41181834761991}, {"type": "manhattan_precision", "value": 70.70076939072572}, {"type": "manhattan_recall", "value": 78.53403141361257}, {"type": "max_accuracy", "value": 86.94454146776886}, {"type": "max_ap", "value": 81.96535237136042}, {"type": "max_f1", "value": 74.41181834761991}]}]}]} |
|
Cloyne/vietnamese-sbert | Cloyne | sentence-similarity | [
"sentence-transformers",
"safetensors",
"roberta",
"sentence-similarity",
"feature-extraction",
"generated_from_trainer",
"dataset_size:120210",
"loss:MultipleNegativesRankingLoss",
"arxiv:1908.10084",
"arxiv:1705.00652",
"base_model:keepitreal/vietnamese-sbert",
"base_model:finetune:keepitreal/vietnamese-sbert",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
]
| 2024-10-28T14:49:39 | 2024-10-28T14:49:54 | 157 | 0 | ---
base_model: keepitreal/vietnamese-sbert
library_name: sentence-transformers
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:120210
- loss:MultipleNegativesRankingLoss
widget:
- source_sentence: Chủ tịch Ủy ban nhân dân xã có quyền ra quyết định cưỡng chế tháo
dỡ công trình xây dựng trên đất nông nghiệp khi chưa chuyển mục đích sử dụng đất
hay không?
sentences:
- 'Đối tượng, điều kiện kéo dài tuổi phục vụ tại ngũ
1. Đối tượng:
a) Quân nhân chuyên nghiệp có trình độ cao đẳng trở lên đang đảm nhiệm các chức
danh: Kỹ thuật viên, Nhân viên Kỹ thuật, Huấn luyện viên, Nghệ sĩ, Nhạc sĩ, Diễn
viên làm việc đúng chuyên ngành đào tạo ở các cơ sở nghiên cứu, nhà trường, bệnh
viện, trung tâm thể dục thể thao, đoàn nghệ thuật, nhà máy, doanh nghiệp quốc
phòng; đơn vị đóng quân ở địa bàn vùng sâu, vùng xa, biên giới, hải đảo.
b) Quân nhân chuyên nghiệp đang làm việc thuộc các chuyên ngành hẹp được đào tạo
công phu hoặc chuyên ngành Quân đội chưa đào tạo được; thợ bậc cao.
c) Quân nhân chuyên nghiệp đang đảm nhiệm chức vụ chỉ huy, quản lý ở các nhà máy,
doanh nghiệp quốc phòng.
d) Quân nhân chuyên nghiệp không thuộc đối tượng quy định tại điểm a, điểm b,
điểm c khoản này do Bộ trưởng Bộ Quốc phòng quyết định.
2. Điều kiện:
Quân nhân chuyên nghiệp thuộc đối tượng quy định tại khoản 1 Điều này được kéo
dài tuổi phục vụ tại ngũ khi có đủ các điều kiện sau:
a) Đơn vị có biên chế và nhu cầu sử dụng;
b) Hết hạn tuổi phục vụ tại ngũ cao nhất theo cấp bậc quân hàm quy định tại khoản
2 Điều 17 Luật Quân nhân chuyên nghiệp, công nhân và viên chức quốc phòng; chưa
có người thay thế; tự nguyện tiếp tục phục vụ tại ngũ;
c) Có đủ phẩm chất chính trị, đạo đức, sức khỏe để hoàn thành nhiệm vụ được giao;
d) Có trình độ chuyên môn kỹ thuật, nghiệp vụ giỏi; tay nghề cao; chất lượng,
hiệu quả công tác tốt.'
- 'Thi hành quyết định cưỡng chế
1. Người ra quyết định cưỡng chế có trách nhiệm gửi ngay quyết định cưỡng chế
cho các cá nhân, tổ chức liên quan và tổ chức thực hiện việc cưỡng chế thi hành
quyết định xử phạt của mình và của cấp dưới.
..."'
- 'Trình tự, thủ tục đăng ký tài khoản định danh điện tử đối với công dân Việt Nam
1. Đăng ký tài khoản định danh điện tử mức độ 1 qua ứng dụng VNelD đối với công
dân đã có thẻ Căn cước công dân gắn chíp điện tử
a) Công dân sử dụng thiết bị di động tải và cài đặt ứng dụng VNelD.
b) Công dân sử dụng ứng dụng VNelD để nhập thông tin về số định danh cá nhân và
số điện thoại hoặc địa chỉ thư điện tử; cung cấp các thông tin theo hướng dẫn
trên ứng dụng VNelD; thu nhận ảnh chân dung bằng thiết bị di động và gửi yêu cầu
đề nghị cấp tài khoản định danh điện tử tới cơ quan quản lý định danh và xác thực
điện tử qua ứng dụng VNelD.
c) Cơ quan quản lý định danh điện tử thông báo kết quả đăng ký tài khoản qua ứng
dụng VNelD hoặc tin nhắn SMS hoặc địa chỉ thư điện tử.
2. Đăng ký tài khoản định danh điện tử mức độ 2
a) Đối với công dân đã được cấp thẻ Căn cước công dân gắn chíp điện tử:
Công dân đến Công an xã, phường, thị trấn hoặc nơi làm thủ tục cấp thẻ Căn cước
công dân để làm thủ tục cấp tài khoản định danh điện tử. Công dân xuất trình thẻ
Căn cước công dân gắn chíp điện tử, cung cấp thông tin về số điện thoại hoặc địa
chỉ thư điện tử và đề nghị bổ sung thông tin được tích hợp vào tài khoản định
danh điện tử.
Cán bộ tiếp nhận nhập thông tin công dân cung cấp vào hệ thống định danh và xác
thực điện tử; chụp ảnh chân dung, thu nhận vân tay của công dân đến làm thủ tục
để xác thực với Cơ sở dữ liệu căn cước công dân và khẳng định sự đồng ý đăng ký
tạo lập tài khoản định danh điện tử.
Cơ quan quản lý định danh điện tử thông báo kết quả đăng ký tài khoản qua ứng
dụng VNelD hoặc tin nhắn SMS hoặc địa chỉ thư điện tử.
b) Cơ quan Công an tiến hành cấp tài khoản định danh điện tử mức độ 2 cùng với
cấp thẻ Căn cước công dân với trường hợp công dân chưa được cấp Căn cước công
dân gắn chíp điện tử.'
- source_sentence: Mức hưởng chế độ thai sản đối với lao động nam là người nước ngoài
được pháp luật quy định như thế nào?
sentences:
- '"Điều 21. Thông báo kết quả và xác nhận nhập học
1. Cơ sở đào tạo gửi giấy báo trúng tuyển cho những thí sinh trúng tuyển, trong
đó ghi rõ những thủ tục cần thiết đối với thí sinh khi nhập học và phương thức
nhập học của thí sinh.
2. Thí sinh xác nhận nhập học bằng hình thức trực tuyến trên hệ thống, trước khi
nhập học tại cơ sở đào tạo.
3. Đối với những thí sinh không xác nhận nhập học trong thời hạn quy định:
a) Nếu không có lý do chính đáng thì coi như thí sinh từ chối nhập học và cơ sở
đào tạo có quyền không tiếp nhận;
b) Nếu do ốm đau, tai nạn, có giấy xác nhận của bệnh viện quận, huyện trở lên
hoặc do thiên tai có xác nhận của UBND quận, huyện trở lên, cơ sở đào tạo xem
xét quyết định tiếp nhận thí sinh vào học hoặc bảo lưu kết quả tuyển sinh để thí
sinh vào học sau;
c) Nếu do sai sót, nhầm lẫn của cán bộ thực hiện công tác tuyển sinh hoặc cá nhân
thí sinh gây ra, cơ sở đào tạo chủ động phối hợp với các cá nhân, tổ chức liên
quan xem xét các minh chứng và quyết định việc tiếp nhận thí sinh vào học hoặc
bảo lưu kết quả tuyển sinh để thí sinh vào học sau.
4. Thí sinh đã xác nhận nhập học tại một cơ sở đào tạo không được tham gia xét
tuyển ở nơi khác hoặc ở các đợt xét tuyển bổ sung, trừ trường hợp được cơ sở đào
tạo cho phép."'
- 'Tổ chức, nhiệm vụ, quyền hạn của Ban Chỉ huy
...
2. Nhiệm vụ, quyền hạn của Ban Chỉ huy:
a) Chỉ đạo xây dựng, ban hành quy định về công tác bảo đảm an toàn PCCC và CNCH
tại Trụ sở cơ quan Bộ Tư pháp.
b) Hướng dẫn, phối hợp với các đơn vị thuộc Bộ và chỉ đạo Đội PCCC và CNCH cơ
sở tổ chức tuyên truyền, bồi dưỡng nghiệp vụ PCCC và CNCH.
c) Chỉ đạo Đội PCCC và CNCH cơ sở tại Trụ sở cơ quan Bộ Tư pháp xây dựng, trình
cấp có thẩm quyền phê duyệt và tổ chức thực tập phương án PCCC, phương án CNCH.
d) Chỉ đạo Đội PCCC và CNCH cơ sở tại Trụ sở cơ quan Bộ Tư pháp quản lý các trang
thiết bị PCCC và CNCH.
đ) Chỉ đạo chữa cháy, CNCH khi xảy ra cháy, sự cố, tai nạn tại Trụ sở cơ quan
Bộ Tư pháp.
e) Chỉ đạo việc tổ chức lập và lưu giữ hồ sơ quản lý, theo dõi hoạt động PCCC,
CNCH tại Trụ sở cơ quan Bộ Tư pháp.
g) Chỉ đạo việc sơ kết, tổng kết các hoạt động về PCCC và CNCH của cơ quan; kiểm
tra, đôn đốc việc chấp hành các quy định về PCCC và CNCH.
h) Đề xuất việc khen thưởng, kỷ luật các tập thể, cá nhân trong việc thực hiện
công tác PCCC, CNCH.
i) Chỉ đạo Đội PCCC và CNCH cơ sở dự trù kinh phí cho các hoạt động PCCC và CNCH
tại Trụ sở cơ quan Bộ Tư pháp.
k) Thực hiện các nhiệm vụ khác do Bộ trưởng giao và theo quy định của pháp luật.'
- 'Mức hưởng chế độ thai sản
...
b) Mức hưởng một ngày đối với trường hợp quy định tại Điều 32 và khoản 2 Điều
34 của Luật này được tính bằng mức hưởng chế độ thai sản theo tháng chia cho 24
ngày.'
- source_sentence: Doanh nghiệp được áp dụng chế độ ưu tiên không cung cấp báo cáo
kiểm toán đúng thời hạn bị phạt bao nhiêu tiền?
sentences:
- 'Thay đổi Thẩm phán, Hội thẩm
1. Thẩm phán, Hội thẩm phải từ chối tham gia xét xử hoặc bị thay đổi khi thuộc
một trong các trường hợp:
a) Trường hợp quy định tại Điều 49 của Bộ luật này;
b) Họ cùng trong một Hội đồng xét xử và là người thân thích với nhau;
c) Đã tham gia xét xử sơ thẩm hoặc phúc thẩm hoặc tiến hành tố tụng vụ án đó với
tư cách là Điều tra viên, Cán bộ điều tra, Kiểm sát viên, Kiểm tra viên, Thẩm
tra viên, Thư ký Tòa án.
2. Việc thay đổi Thẩm phán, Hội thẩm trước khi mở phiên tòa do Chánh án hoặc Phó
Chánh án Tòa án được phân công giải quyết vụ án quyết định.
Thẩm phán bị thay đổi là Chánh án Tòa án thì do Chánh án Tòa án trên một cấp quyết
định.
Việc thay đổi Thẩm phán, Hội thẩm tại phiên tòa do Hội đồng xét xử quyết định
trước khi bắt đầu xét hỏi bằng cách biểu quyết tại phòng nghị án. Khi xem xét
thay đổi thành viên nào thì thành viên đó được trình bày ý kiến của mình, Hội
đồng quyết định theo đa số.
Trường hợp phải thay đổi Thẩm phán, Hội thẩm tại phiên tòa thì Hội đồng xét xử
ra quyết định hoãn phiên tòa.'
- '“Điều 21. Chấm dứt hưởng trợ cấp thất nghiệp
1. Các trường hợp người lao động đang hưởng trợ cấp thất nghiệp bị chấm dứt hưởng
trợ cấp thất nghiệp được quy định như sau:
e) Trong thời gian hưởng trợ cấp thất nghiệp, 03 tháng liên tục không thực hiện
thông báo hằng tháng về việc tìm kiếm việc làm với trung tâm dịch vụ việc làm
theo quy định
Ngày mà người lao động được xác định bị chấm dứt hưởng trợ cấp thất nghiệp là
ngày kết thúc của thời hạn thông báo tìm kiếm việc làm của tháng thứ 3 liên tục
mà người lao động không thực hiện thông báo hằng tháng về việc tìm kiếm việc làm."'
- 'Vi phạm quy định về thời hạn làm thủ tục hải quan, nộp hồ sơ thuế
...
2. Phạt tiền từ 1.000.000 đồng đến 2.000.000 đồng đối với hành vi không thực hiện
đúng thời hạn quy định thuộc một trong các trường hợp sau:
a) Cung cấp báo cáo kiểm toán, báo cáo tài chính của doanh nghiệp được áp dụng
chế độ ưu tiên;
b) Thông báo cho cơ quan hải quan quyết định xử lý vi phạm pháp luật về quản lý
thuế, kế toán đối với doanh nghiệp được áp dụng chế độ ưu tiên;
c) Báo cáo về lượng hàng hóa nhập khẩu phục vụ xây dựng nhà xưởng, hàng hóa gửi
kho bên ngoài của doanh nghiệp chế xuất;
d) Báo cáo về lượng hàng hóa trung chuyển đưa vào, đưa ra, còn lưu tại cảng;
đ) Báo cáo thống kê thông quan hàng bưu chính đưa vào Việt Nam để chuyển tiếp
đi quốc tế.
...'
- source_sentence: Tài chính của Hội Kiểm toán viên hành nghề Việt Nam được chi cho
những khoản nào?
sentences:
- 'Giải thể và xử lý tài chính khi giải thể
1. Khi xét thấy hoạt động của Hội không có hiệu quả, không mang lại lợi ích cho
Hội viên hoặc gây phiền hà, cản trở cho Hội viên thì BCH Hội quyết định triệu
tập Đại hội để bàn biện pháp củng cố tổ chức hoặc giải thể Hội. Nếu giải thể Hội
thì do Đại hội đại biểu hoặc Đại hội toàn quốc của Hội thông qua và đề nghị cơ
quan Nhà nước có thẩm quyền xem xét, quyết định.
2. Khi Hội bị giải thể, Ban Thường trực và Ban Kiểm tra của Hội phải tiến hành
kiểm kê tài sản, kiểm quỹ và báo cáo BCH Hội quyết định việc xử lý tài sản, tiền
tồn quỹ và tiến hành thủ tục giải thể theo quy định của pháp luật.'
- '"Điều 14. Miễn trừ đối với thỏa thuận hạn chế cạnh tranh bị cấm
1. Thỏa thuận hạn chế cạnh tranh quy định tại các khoản 1, 2, 3, 7, 8, 9, 10 và
11 Điều 11 bị cấm theo quy định tại Điều 12 của Luật này được miễn trừ có thời
hạn nếu có lợi cho người tiêu dùng và đáp ứng một trong các điều kiện sau đây:
a) Tác động thúc đẩy tiến bộ kỹ thuật, công nghệ, nâng cao chất lượng hàng hóa,
dịch vụ;
b) Tăng cường sức cạnh tranh của doanh nghiệp Việt Nam trên thị trường quốc tế;
c) Thúc đẩy việc áp dụng thống nhất tiêu chuẩn chất lượng, định mức kỹ thuật của
chủng loại sản phẩm;
d) Thống nhất các điều kiện thực hiện hợp đồng, giao hàng, thanh toán nhưng không
liên quan đến giá và các yếu tố của giá.
2. Thỏa thuận lao động, thỏa thuận hợp tác trong các ngành, lĩnh vực đặc thù được
thực hiện theo quy định của luật khác thì thực hiện theo quy định của luật đó".'
- '"Điều 2. Sửa đổi, bổ sung một số điều của Nghị định số 15/2019/NĐ-CP ngày 01
tháng 02 năm 2019 của Chính phủ quy định chi tiết một số điều và biện pháp thi
hành Luật Giáo dục nghề nghiệp
...
12. Sửa đổi, bổ sung Điều 24 như sau:
Điều 24. Thẩm quyền cấp giấy chứng nhận đăng ký hoạt động liên kết đào tạo với
nước ngoài
1. Tổng cục Giáo dục nghề nghiệp cấp giấy chứng nhận đăng ký hoạt động liên kết
đào tạo với nước ngoài đối với trường cao đẳng.
2. Sở Lao động - Thương binh và Xã hội nơi trường trung cấp, trung tâm giáo dục
nghề nghiệp, trung tâm giáo dục nghề nghiệp - giáo dục thường xuyên và doanh nghiệp
tổ chức hoạt động liên kết đào tạo với nước ngoài cấp giấy chứng nhận đăng ký
hoạt động liên kết đào tạo với nước ngoài đối với trường trung cấp, trung tâm
giáo dục nghề nghiệp, trung tâm giáo dục nghề nghiệp - giáo dục thường xuyên và
doanh nghiệp."'
- source_sentence: NLĐ ký nhiều hợp đồng lao động thì đóng BHYT như thế nào?
sentences:
- 'Hồ sơ, thủ tục xác định trường hợp được bồi thường
[...]
3. Trong thời hạn 05 ngày làm việc, kể từ ngày nhận được đơn và các giấy tờ hợp
lệ, nếu xác định yêu cầu thuộc trách nhiệm giải quyết của mình thì Sở Y tế phải
thụ lý và thông báo bằng văn bản về việc thụ lý đơn cho người bị thiệt hại hoặc
thân nhân của người bị thiệt hại (sau đây gọi tắt là người bị thiệt hại). Trường
hợp hồ sơ không đầy đủ thì Sở Y tế có văn bản hướng dẫn người bị thiệt hại bổ
sung.
4. Trong thời hạn 15 ngày, kể từ ngày nhận được đơn yêu cầu của người bị thiệt
hại, Sở Y tế phải hoàn thành việc xác định nguyên nhân gây tai biến, mức độ tổn
thương và thông báo bằng văn bản cho người yêu cầu đồng thời báo cáo Bộ Y tế.'
- 'Chuyển nhượng quyền thăm dò khoáng sản
1. Tổ chức, cá nhân nhận chuyển nhượng quyền thăm dò khoáng sản phải có đủ điều
kiện để được cấp Giấy phép thăm dò khoáng sản theo quy định của Luật này.
2. Việc chuyển nhượng quyền thăm dò khoáng sản phải được cơ quan quản lý nhà nước
có thẩm quyền cấp Giấy phép thăm dò khoáng sản chấp thuận; trường hợp được chấp
thuận, tổ chức, cá nhân nhận chuyển nhượng quyền thăm dò khoáng sản được cấp Giấy
phép thăm dò khoáng sản mới.
3. Tổ chức, cá nhân chuyển nhượng quyền thăm dò khoáng sản đã thực hiện được ít
nhất 50% dự toán của đề án thăm dò khoáng sản.
4. Chính phủ quy định chi tiết việc chuyển nhượng quyền thăm dò khoáng sản.'
- '"Sửa đổi, bổ sung một số điều của Luật bảo hiểm y tế:
...
6. Sửa đổi, bổ sung Điều 12 như sau:
“Điều 12. Đối tượng tham gia bảo hiểm y tế
1. Nhóm do người lao động và người sử dụng lao động đóng, bao gồm:
a) Người lao động làm việc theo hợp đồng lao động không xác định thời hạn, hợp
đồng lao động có thời hạn từ đủ 3 tháng trở lên; người lao động là người quản
lý doanh nghiệp hưởng tiền lương; cán bộ, công chức, viên chức (sau đây gọi chung
là người lao động);
b) Người hoạt động không chuyên trách ở xã, phường, thị trấn theo quy định của
pháp luật.=
...
4. Nhóm được ngân sách nhà nước hỗ trợ mức đóng, bao gồm:
a) Người thuộc hộ gia đình cận nghèo;
b) Học sinh, sinh viên.
5. Nhóm tham gia bảo hiểm y tế theo hộ gia đình gồm những người thuộc hộ gia đình,
trừ đối tượng quy định tại các khoản 1, 2, 3 và 4 Điều này.
6. Chính phủ quy định các đối tượng khác ngoài các đối tượng quy định tại các
khoản 3, 4 và 5 Điều này; quy định việc cấp thẻ bảo hiểm y tế đối với đối tượng
do Bộ Quốc phòng, Bộ Công an quản lý và đối tượng quy định tại điểm 1 khoản 3
Điều này; quy định lộ trình thực hiện bảo hiểm y tế, phạm vi quyền lợi, mức hưởng
bảo hiểm y tế, khám bệnh, chữa bệnh bảo hiểm y tế, quản lý, sử dụng phần kinh
phí dành cho khám bệnh, chữa bệnh bảo hiểm y tế, giám định bảo hiểm y tế, thanh
toán, quyết toán bảo hiểm y tế đối với các đối tượng quy định tại điểm a khoản
3 Điều này.”'
---
# SentenceTransformer based on keepitreal/vietnamese-sbert
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [keepitreal/vietnamese-sbert](https://huggingface.co/keepitreal/vietnamese-sbert) on the csv dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [keepitreal/vietnamese-sbert](https://huggingface.co/keepitreal/vietnamese-sbert) <!-- at revision a9467ef2ef47caa6448edeabfd8e5e5ce0fa2a23 -->
- **Maximum Sequence Length:** 256 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Dataset:**
- csv
- **Language:** Vietnamese
<!-- - **License:** Unknown -->
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 256, 'do_lower_case': False}) with Transformer model: RobertaModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
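The `Pooling` module above averages token embeddings over non-padding positions (mean pooling) rather than taking the CLS token. Below is a minimal sketch of the equivalent computation with plain 🤗 Transformers; it assumes this repository also exposes standard RoBERTa weights alongside the sentence-transformers configuration, which is the usual layout for such checkpoints.
```python
import torch
from transformers import AutoModel, AutoTokenizer
# Hypothetical direct-Transformers usage; assumes standard RoBERTa weights
# are hosted at the repository root.
tokenizer = AutoTokenizer.from_pretrained("Cloyne/vietnamese-sbert")
model = AutoModel.from_pretrained("Cloyne/vietnamese-sbert")
encoded = tokenizer(
    ["NLĐ ký nhiều hợp đồng lao động thì đóng BHYT như thế nào?"],
    padding=True, truncation=True, max_length=256, return_tensors="pt",
)
with torch.no_grad():
    token_embeddings = model(**encoded).last_hidden_state  # (batch, seq_len, 768)
# Mean pooling over non-padding tokens, mirroring the Pooling module above.
mask = encoded["attention_mask"].unsqueeze(-1).float()
sentence_embedding = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1)
print(sentence_embedding.shape)  # torch.Size([1, 768])
```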
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("Cloyne/vietnamese-sbert")
# Run inference
sentences = [
'NLĐ ký nhiều hợp đồng lao động thì đóng BHYT như thế nào?',
'"Sửa đổi, bổ sung một số điều của Luật bảo hiểm y tế:\n...\n6. Sửa đổi, bổ sung Điều 12 như sau:\n“Điều 12. Đối tượng tham gia bảo hiểm y tế\n1. Nhóm do người lao động và người sử dụng lao động đóng, bao gồm:\na) Người lao động làm việc theo hợp đồng lao động không xác định thời hạn, hợp đồng lao động có thời hạn từ đủ 3 tháng trở lên; người lao động là người quản lý doanh nghiệp hưởng tiền lương; cán bộ, công chức, viên chức (sau đây gọi chung là người lao động);\nb) Người hoạt động không chuyên trách ở xã, phường, thị trấn theo quy định của pháp luật.=\n...\n4. Nhóm được ngân sách nhà nước hỗ trợ mức đóng, bao gồm:\na) Người thuộc hộ gia đình cận nghèo;\nb) Học sinh, sinh viên.\n5. Nhóm tham gia bảo hiểm y tế theo hộ gia đình gồm những người thuộc hộ gia đình, trừ đối tượng quy định tại các khoản 1, 2, 3 và 4 Điều này.\n6. Chính phủ quy định các đối tượng khác ngoài các đối tượng quy định tại các khoản 3, 4 và 5 Điều này; quy định việc cấp thẻ bảo hiểm y tế đối với đối tượng do Bộ Quốc phòng, Bộ Công an quản lý và đối tượng quy định tại điểm 1 khoản 3 Điều này; quy định lộ trình thực hiện bảo hiểm y tế, phạm vi quyền lợi, mức hưởng bảo hiểm y tế, khám bệnh, chữa bệnh bảo hiểm y tế, quản lý, sử dụng phần kinh phí dành cho khám bệnh, chữa bệnh bảo hiểm y tế, giám định bảo hiểm y tế, thanh toán, quyết toán bảo hiểm y tế đối với các đối tượng quy định tại điểm a khoản 3 Điều này.”',
'Hồ sơ, thủ tục xác định trường hợp được bồi thường\n[...]\n3. Trong thời hạn 05 ngày làm việc, kể từ ngày nhận được đơn và các giấy tờ hợp lệ, nếu xác định yêu cầu thuộc trách nhiệm giải quyết của mình thì Sở Y tế phải thụ lý và thông báo bằng văn bản về việc thụ lý đơn cho người bị thiệt hại hoặc thân nhân của người bị thiệt hại (sau đây gọi tắt là người bị thiệt hại). Trường hợp hồ sơ không đầy đủ thì Sở Y tế có văn bản hướng dẫn người bị thiệt hại bổ sung.\n4. Trong thời hạn 15 ngày, kể từ ngày nhận được đơn yêu cầu của người bị thiệt hại, Sở Y tế phải hoàn thành việc xác định nguyên nhân gây tai biến, mức độ tổn thương và thông báo bằng văn bản cho người yêu cầu đồng thời báo cáo Bộ Y tế.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
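Because the similarity function is cosine similarity, the same embeddings support semantic search over a passage collection. Here is a minimal sketch; the query and corpus strings are illustrative placeholders, not examples drawn from the training data.
```python
from sentence_transformers import SentenceTransformer, util
model = SentenceTransformer("Cloyne/vietnamese-sbert")
query = "Phiếu bầu nào bị coi là không hợp lệ?"
corpus = [
    "Phiếu không có dấu của Tổ bầu cử là phiếu bầu không hợp lệ.",
    "Người lao động nghỉ việc không hưởng lương từ 01 tháng trở lên.",
]
query_emb = model.encode(query, convert_to_tensor=True)
corpus_emb = model.encode(corpus, convert_to_tensor=True)
# Rank the passages by cosine similarity to the query.
hits = util.semantic_search(query_emb, corpus_emb, top_k=2)[0]
for hit in hits:
    print(f"{hit['score']:.3f}", corpus[hit["corpus_id"]])
```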
<!--
### Direct Usage (Transformers)
<details><summary>Click to see the direct usage in Transformers</summary>
</details>
-->
<!--
### Downstream Usage (Sentence Transformers)
You can finetune this model on your own dataset.
<details><summary>Click to expand</summary>
</details>
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Dataset
#### csv
* Dataset: csv
* Size: 120,210 training samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 1000 samples:
| | anchor | positive |
|:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 8 tokens</li><li>mean: 25.08 tokens</li><li>max: 49 tokens</li></ul> | <ul><li>min: 21 tokens</li><li>mean: 206.98 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
| anchor | positive |
|:--------------------------------------------------------------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>Nội dung lồng ghép vấn đề bình đẳng giới trong xây dựng văn bản quy phạm pháp luật được quy định thế nào?</code> | <code>Nội dung lồng ghép vấn đề bình đẳng giới trong xây dựng văn bản quy phạm pháp luật<br>Trong phạm vi điều chỉnh của văn bản quy phạm pháp luật:<br>1. Xác định nội dung liên quan đến vấn đề bình đẳng giới hoặc vấn đề bất bình đẳng giới, phân biệt đối xử về giới.<br>2. Quy định các biện pháp cần thiết để thực hiện bình đẳng giới hoặc để giải quyết vấn đề bất bình đẳng giới, phân biệt đối xử về giới; dự báo tác động của các quy định đó đối với nam và nữ sau khi được ban hành.<br>3. Xác định nguồn nhân lực, tài chính cần thiết để triển khai các biện pháp thực hiện bình đẳng giới hoặc để giải quyết vấn đề bất bình đẳng giới, phân biệt đối xử về giới.</code> |
| <code>Điều kiện để giáo viên trong cơ sở giáo dục mầm non, tiểu học ngoài công lập bị ảnh hưởng bởi Covid-19 được hưởng chính sách hỗ trợ là gì?</code> | <code>Điều kiện được hưởng<br>Cán bộ quản lý, giáo viên, nhân viên được hưởng chính sách khi bảo đảm các điều kiện sau:<br>1. Là người đang làm việc tại cơ sở giáo dục ngoài công lập trước khi cơ sở phải tạm dừng hoạt động theo yêu cầu của cơ quan nhà nước có thẩm quyền để phòng, chống dịch COVID-19 tính từ ngày 01 tháng 5 năm 2021 đến hết ngày 31 tháng 12 năm 2021.<br>2. Nghỉ việc không hưởng lương từ 01 tháng trở lên tính từ ngày 01 tháng 5 năm 2021 đến hết ngày 31 tháng 12 năm 2021.<br>3. Chưa được hưởng chính sách hỗ trợ đối với người lao động tạm hoãn hợp đồng lao động, nghỉ việc không hưởng lương theo quy định tại khoản 4, khoản 5, khoản 6 Mục II Nghị quyết số 68/NQ-CP ngày 01 tháng 7 năm 2021 của Chính phủ về một số chính sách hỗ trợ người lao động và người sử dụng lao động gặp khó khăn do đại dịch COVID-19, Nghị quyết số 126/NQ-CP ngày 08 tháng 10 năm 2021 của Chính phủ sửa đổi, bổ sung Nghị quyết số 68/NQ-CP ngày 01 tháng 7 năm 2021 của Chính phủ về một số chính sách hỗ trợ người lao động và người sử dụng lao động gặp khó khăn do đại dịch COVID-19 (sau đây gọi tắt là Nghị quyết số 68/NQ-CP) do không tham gia Bảo hiểm xã hội bắt buộc.<br>4. Có xác nhận làm việc tại cơ sở giáo dục ngoài công lập ít nhất hết năm học 2021 - 2022 theo kế hoạch năm học của địa phương, bao gồm cơ sở giáo dục ngoài công lập đã làm việc trước đây hoặc cơ sở giáo dục ngoài công lập khác trong trường hợp cơ sở giáo dục ngoài công lập trước đây làm việc không hoạt động trở lại.</code> |
| <code>Nguyên tắc áp dụng phụ cấp ưu đãi nghề y tế thế nào?</code> | <code>Nguyên tắc áp dụng<br>1. Trường hợp công chức, viên chức chuyên môn y tế thuộc đối tượng được hưởng các mức phụ cấp ưu đãi theo nghề khác nhau thì được hưởng một mức phụ cấp ưu đãi theo nghề cao nhất.<br>2. Công chức, viên chức đã hưởng phụ cấp ưu đãi theo nghề quy định tại Thông tư liên tịch số 06/2010/TTLT-BYT-BNV-BTC ngày 22/3/2010 của Bộ Y tế, Bộ Nội vụ, Bộ Tài chính hướng dẫn thực hiện Nghị định số 64/2009/NĐ-CP ngày 30/7/2009 của Chính phủ về chính sách đối với cán bộ, viên chức y tế công tác ở vùng có điều kiện kinh tế - xã hội đặc biệt khó khăn thì không hưởng phụ cấp ưu đãi theo nghề quy định tại Thông tư liên tịch này.</code> |
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
```json
{
"scale": 20.0,
"similarity_fct": "cos_sim"
}
```
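These parameters map one-to-one onto the loss constructor. A minimal sketch of the instantiation, assuming the base model as the starting point (with this loss, the positives of the other anchors in a batch serve as in-batch negatives):
```python
from sentence_transformers import SentenceTransformer, losses, util
model = SentenceTransformer("keepitreal/vietnamese-sbert")
# scale=20.0 and cosine similarity match the parameters listed above.
train_loss = losses.MultipleNegativesRankingLoss(
    model, scale=20.0, similarity_fct=util.cos_sim
)
```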
### Evaluation Dataset
#### train
* Dataset: train
* Size: 13,357 evaluation samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 1000 samples:
| | anchor | positive |
|:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 7 tokens</li><li>mean: 24.61 tokens</li><li>max: 51 tokens</li></ul> | <ul><li>min: 17 tokens</li><li>mean: 202.71 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
| anchor | positive |
|:-------------------------------------------------------------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>Toà án cấp nào có thẩm quyền giải quyết việc đòi tài sản đã cho người khác vay theo hợp đồng cho vay?</code> | <code>"Điều 35. Thẩm quyền của Tòa án nhân dân cấp huyện<br>1. Tòa án nhân dân cấp huyện có thẩm quyền giải quyết theo thủ tục sơ thẩm những tranh chấp sau đây:<br>a) Tranh chấp về dân sự, hôn nhân và gia đình quy định tại Điều 26 và Điều 28 của Bộ luật này, trừ tranh chấp quy định tại khoản 7 Điều 26 của Bộ luật này;<br>b) Tranh chấp về kinh doanh, thương mại quy định tại khoản 1 Điều 30 của Bộ luật này;<br>c) Tranh chấp về lao động quy định tại Điều 32 của Bộ luật này.<br>2. Tòa án nhân dân cấp huyện có thẩm quyền giải quyết những yêu cầu sau đây:<br>a) Yêu cầu về dân sự quy định tại các khoản 1, 2, 3, 4, 6, 7, 8, 9 và 10 Điều 27 của Bộ luật này;<br>b) Yêu cầu về hôn nhân và gia đình quy định tại các khoản 1, 2, 3, 4, 5, 6, 7, 8, 10 và 11 Điều 29 của Bộ luật này;<br>c) Yêu cầu về kinh doanh, thương mại quy định tại khoản 1 và khoản 6 Điều 31 của Bộ luật này;<br>d) Yêu cầu về lao động quy định tại khoản 1 và khoản 5 Điều 33 của Bộ luật này.<br>3. Những tranh chấp, yêu cầu quy định tại khoản 1 và khoản 2 Điều này mà có đương sự hoặc tài sản ở nước ngoài hoặc cần phải ủy thác tư pháp cho cơ quan đại diện nước Cộng hòa xã hội chủ nghĩa Việt Nam ở nước ngoài, cho Tòa án, cơ quan có thẩm quyền của nước ngoài không thuộc thẩm quyền giải quyết của Tòa án nhân dân cấp huyện, trừ trường hợp quy định tại khoản 4 Điều này.<br>4. Tòa án nhân dân cấp huyện nơi cư trú của công dân Việt Nam hủy việc kết hôn trái pháp luật, giải quyết việc ly hôn, các tranh chấp về quyền và nghĩa vụ của vợ chồng, cha mẹ và con, về nhận cha, mẹ, con, nuôi con nuôi và giám hộ giữa công dân Việt Nam cư trú ở khu vực biên giới với công dân của nước láng giềng cùng cư trú ở khu vực biên giới với Việt Nam theo quy định của Bộ luật này và các quy định khác của pháp luật Việt Nam."</code> |
| <code>Những phiếu bầu nào được xem là không hợp lệ?</code> | <code>Phiếu bầu không hợp lệ<br>1. Những phiếu bầu sau đây là phiếu bầu không hợp lệ:<br>a) Phiếu không theo mẫu quy định do Tổ bầu cử phát ra;<br>b) Phiếu không có dấu của Tổ bầu cử;<br>c) Phiếu để số người được bầu nhiều hơn số lượng đại biểu được bầu đã ấn định cho đơn vị bầu cử;<br>d) Phiếu gạch xóa hết tên những người ứng cử;<br>đ) Phiếu ghi thêm tên người ngoài danh sách những người ứng cử hoặc phiếu có ghi thêm nội dung khác.<br>2. Trường hợp có phiếu bầu được cho là không hợp lệ thì Tổ trường Tổ bầu cử đưa ra để toàn Tổ xem xét, quyết định. Tổ bầu cử không được gạch xóa hoặc sửa các tên ghi trên phiếu bầu.</code> |
| <code>Đề nghị tạm đình chỉ chấp hành quyết định áp dụng biện pháp đưa vào trường giáo dưỡng cho học sinh cần đảm bảo nguyên tắc gì?</code> | <code>Nguyên tắc xét duyệt, đề nghị giảm thời hạn, tạm đình chỉ chấp hành quyết định, miễn chấp hành phần thời gian còn lại cho học sinh trường giáo dưỡng, trại viên cơ sở giáo dục bắt buộc<br>1. Tuân thủ quy định của pháp luật về thi hành biện pháp xử lý hành chính đưa vào trường giáo dưỡng, cơ sở giáo dục bắt buộc, quy định tại Thông tư này và quy định của pháp luật có liên quan.<br>2. Bảo đảm khách quan, công khai, minh bạch, đúng trình tự, thủ tục, thẩm quyền; tôn trọng và bảo vệ quyền, lợi ích hợp pháp của học sinh trường giáo dưỡng, trại viên cơ sở giáo dục bắt buộc.</code> |
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
```json
{
"scale": 20.0,
"similarity_fct": "cos_sim"
}
```
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: steps
- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 32
- `num_train_epochs`: 4
- `warmup_ratio`: 0.1
- `fp16`: True
- `batch_sampler`: no_duplicates
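For reference, a sketch of how these non-default values would map onto `SentenceTransformerTrainingArguments`; the output directory is a placeholder, not the author's actual path:
```python
from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers
args = SentenceTransformerTrainingArguments(
    output_dir="outputs/vietnamese-sbert-finetuned",  # placeholder path
    eval_strategy="steps",
    per_device_train_batch_size=16,
    per_device_eval_batch_size=32,
    num_train_epochs=4,
    warmup_ratio=0.1,
    fp16=True,
    batch_sampler=BatchSamplers.NO_DUPLICATES,  # no duplicate texts within a batch
)
```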
#### All Hyperparameters
<details><summary>Click to expand</summary>
- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 32
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 5e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 4
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional
</details>
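Putting the loss and hyperparameter sketches together, the fine-tuning run would look roughly as follows. This is a schematic reconstruction rather than the author's exact script: the CSV file names are placeholders, and each file is assumed to provide `anchor` and `positive` columns.
```python
from datasets import load_dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
    losses,
)
from sentence_transformers.training_args import BatchSamplers
model = SentenceTransformer("keepitreal/vietnamese-sbert")
loss = losses.MultipleNegativesRankingLoss(model, scale=20.0)
# Placeholder file names; each CSV is assumed to hold "anchor" and "positive" columns.
train_ds = load_dataset("csv", data_files="train.csv", split="train")
eval_ds = load_dataset("csv", data_files="eval.csv", split="train")
args = SentenceTransformerTrainingArguments(
    output_dir="outputs/vietnamese-sbert-finetuned",  # placeholder path
    eval_strategy="steps",
    per_device_train_batch_size=16,
    per_device_eval_batch_size=32,
    num_train_epochs=4,
    warmup_ratio=0.1,
    fp16=True,
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)
trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_ds,
    eval_dataset=eval_ds,
    loss=loss,
)
trainer.train()
```
With `eval_strategy="steps"` and no explicit `eval_steps`, evaluation falls back to the default logging interval of 500 steps, which is consistent with the training logs below.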
### Training Logs
| Epoch | Step | Training Loss | train loss |
|:------:|:-----:|:-------------:|:----------:|
| 0.1331 | 500 | 0.3247 | 0.2239 |
| 0.2662 | 1000 | 0.1513 | 0.1605 |
| 0.3993 | 1500 | 0.119 | 0.1664 |
| 0.5323 | 2000 | 0.1047 | 0.1384 |
| 0.6654 | 2500 | 0.0915 | 0.1269 |
| 0.7985 | 3000 | 0.0861 | 0.1140 |
| 0.9316 | 3500 | 0.0839 | 0.1091 |
| 1.0647 | 4000 | 0.0693 | 0.0989 |
| 1.1978 | 4500 | 0.0582 | 0.0931 |
| 1.3308 | 5000 | 0.0457 | 0.0953 |
| 1.4639 | 5500 | 0.0284 | 0.0826 |
| 1.5970 | 6000 | 0.0233 | 0.0848 |
| 1.7301 | 6500 | 0.0256 | 0.0785 |
| 1.8632 | 7000 | 0.0236 | 0.0829 |
| 1.9963 | 7500 | 0.0203 | 0.0827 |
| 2.1294 | 8000 | 0.0182 | 0.0730 |
| 2.2624 | 8500 | 0.0143 | 0.0718 |
| 2.3955 | 9000 | 0.0103 | 0.0720 |
| 2.5286 | 9500 | 0.0086 | 0.0720 |
| 2.6617 | 10000 | 0.0058 | 0.0706 |
| 2.7948 | 10500 | 0.0074 | 0.0675 |
| 2.9279 | 11000 | 0.0073 | 0.0650 |
| 3.0610 | 11500 | 0.0054 | 0.0651 |
| 3.1940 | 12000 | 0.0043 | 0.0639 |
| 3.3271 | 12500 | 0.004 | 0.0626 |
| 3.4602 | 13000 | 0.0035 | 0.0617 |
| 3.5933 | 13500 | 0.0022 | 0.0614 |
| 3.7264 | 14000 | 0.003 | 0.0624 |
| 3.8595 | 14500 | 0.0022 | 0.0616 |
| 3.9925 | 15000 | 0.0028 | 0.0606 |
### Framework Versions
- Python: 3.10.14
- Sentence Transformers: 3.2.1
- Transformers: 4.45.1
- PyTorch: 2.4.0
- Accelerate: 0.34.2
- Datasets: 3.0.1
- Tokenizers: 0.20.0
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
```
#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
--> | [
"TEXT_CLASSIFICATION"
]
| [
"CHIA"
]
| Non_BioNLP |
# SentenceTransformer based on keepitreal/vietnamese-sbert
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [keepitreal/vietnamese-sbert](https://huggingface.co/keepitreal/vietnamese-sbert) on a csv dataset of Vietnamese legal question–passage pairs. It maps sentences and paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [keepitreal/vietnamese-sbert](https://huggingface.co/keepitreal/vietnamese-sbert) <!-- at revision a9467ef2ef47caa6448edeabfd8e5e5ce0fa2a23 -->
- **Maximum Sequence Length:** 256 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Dataset:**
- csv
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 256, 'do_lower_case': False}) with Transformer model: RobertaModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
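The two modules above amount to a RoBERTa encoder followed by attention-mask-aware mean pooling over token embeddings. For reference, here is a minimal sketch of the equivalent computation with plain `transformers`; the repo id is taken from the usage example below, and everything else is standard Hugging Face API:

```python
import torch
from transformers import AutoModel, AutoTokenizer

repo_id = "Cloyne/vietnamese-embedding_finetuned"  # same repo id as in the usage example below
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModel.from_pretrained(repo_id)

encoded = tokenizer(
    ["Những phiếu bầu nào được xem là không hợp lệ?"],
    padding=True, truncation=True, max_length=256, return_tensors="pt",
)

with torch.no_grad():
    token_embeddings = model(**encoded).last_hidden_state  # (batch, seq_len, 768)

# Attention-mask-weighted mean pooling, matching the Pooling module's mean-tokens mode.
mask = encoded["attention_mask"].unsqueeze(-1).float()
sentence_embeddings = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)
print(sentence_embeddings.shape)  # torch.Size([1, 768])
```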
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("Cloyne/vietnamese-embedding_finetuned")
# Run inference
sentences = [
'NLĐ ký nhiều hợp đồng lao động thì đóng BHYT như thế nào?',
'"Sửa đổi, bổ sung một số điều của Luật bảo hiểm y tế:\n...\n6. Sửa đổi, bổ sung Điều 12 như sau:\n“Điều 12. Đối tượng tham gia bảo hiểm y tế\n1. Nhóm do người lao động và người sử dụng lao động đóng, bao gồm:\na) Người lao động làm việc theo hợp đồng lao động không xác định thời hạn, hợp đồng lao động có thời hạn từ đủ 3 tháng trở lên; người lao động là người quản lý doanh nghiệp hưởng tiền lương; cán bộ, công chức, viên chức (sau đây gọi chung là người lao động);\nb) Người hoạt động không chuyên trách ở xã, phường, thị trấn theo quy định của pháp luật.=\n...\n4. Nhóm được ngân sách nhà nước hỗ trợ mức đóng, bao gồm:\na) Người thuộc hộ gia đình cận nghèo;\nb) Học sinh, sinh viên.\n5. Nhóm tham gia bảo hiểm y tế theo hộ gia đình gồm những người thuộc hộ gia đình, trừ đối tượng quy định tại các khoản 1, 2, 3 và 4 Điều này.\n6. Chính phủ quy định các đối tượng khác ngoài các đối tượng quy định tại các khoản 3, 4 và 5 Điều này; quy định việc cấp thẻ bảo hiểm y tế đối với đối tượng do Bộ Quốc phòng, Bộ Công an quản lý và đối tượng quy định tại điểm 1 khoản 3 Điều này; quy định lộ trình thực hiện bảo hiểm y tế, phạm vi quyền lợi, mức hưởng bảo hiểm y tế, khám bệnh, chữa bệnh bảo hiểm y tế, quản lý, sử dụng phần kinh phí dành cho khám bệnh, chữa bệnh bảo hiểm y tế, giám định bảo hiểm y tế, thanh toán, quyết toán bảo hiểm y tế đối với các đối tượng quy định tại điểm a khoản 3 Điều này.”',
'Hồ sơ, thủ tục xác định trường hợp được bồi thường\n[...]\n3. Trong thời hạn 05 ngày làm việc, kể từ ngày nhận được đơn và các giấy tờ hợp lệ, nếu xác định yêu cầu thuộc trách nhiệm giải quyết của mình thì Sở Y tế phải thụ lý và thông báo bằng văn bản về việc thụ lý đơn cho người bị thiệt hại hoặc thân nhân của người bị thiệt hại (sau đây gọi tắt là người bị thiệt hại). Trường hợp hồ sơ không đầy đủ thì Sở Y tế có văn bản hướng dẫn người bị thiệt hại bổ sung.\n4. Trong thời hạn 15 ngày, kể từ ngày nhận được đơn yêu cầu của người bị thiệt hại, Sở Y tế phải hoàn thành việc xác định nguyên nhân gây tai biến, mức độ tổn thương và thông báo bằng văn bản cho người yêu cầu đồng thời báo cáo Bộ Y tế.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
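Because the training pairs are questions matched to legal passages, a common downstream pattern is semantic retrieval over a passage corpus. A minimal sketch using the built-in `util.semantic_search` helper (the two-passage corpus here is a hypothetical stand-in for a real document store):

```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("Cloyne/vietnamese-embedding_finetuned")

# Hypothetical mini-corpus; in practice this would be your indexed passage store.
corpus = [
    "Phiếu không có dấu của Tổ bầu cử là phiếu bầu không hợp lệ.",
    "Tòa án nhân dân cấp huyện có thẩm quyền giải quyết tranh chấp về dân sự.",
]
corpus_embeddings = model.encode(corpus, convert_to_tensor=True)

query_embedding = model.encode("Những phiếu bầu nào không hợp lệ?", convert_to_tensor=True)
hits = util.semantic_search(query_embedding, corpus_embeddings, top_k=2)[0]
for hit in hits:
    print(round(hit["score"], 3), corpus[hit["corpus_id"]])
```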
<!--
### Direct Usage (Transformers)
<details><summary>Click to see the direct usage in Transformers</summary>
</details>
-->
<!--
### Downstream Usage (Sentence Transformers)
You can finetune this model on your own dataset.
<details><summary>Click to expand</summary>
</details>
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Dataset
#### csv
* Dataset: csv
* Size: 120,210 training samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 1000 samples:
| | anchor | positive |
|:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 8 tokens</li><li>mean: 25.08 tokens</li><li>max: 49 tokens</li></ul> | <ul><li>min: 21 tokens</li><li>mean: 206.98 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
| anchor | positive |
|:--------------------------------------------------------------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>Nội dung lồng ghép vấn đề bình đẳng giới trong xây dựng văn bản quy phạm pháp luật được quy định thế nào?</code> | <code>Nội dung lồng ghép vấn đề bình đẳng giới trong xây dựng văn bản quy phạm pháp luật<br>Trong phạm vi điều chỉnh của văn bản quy phạm pháp luật:<br>1. Xác định nội dung liên quan đến vấn đề bình đẳng giới hoặc vấn đề bất bình đẳng giới, phân biệt đối xử về giới.<br>2. Quy định các biện pháp cần thiết để thực hiện bình đẳng giới hoặc để giải quyết vấn đề bất bình đẳng giới, phân biệt đối xử về giới; dự báo tác động của các quy định đó đối với nam và nữ sau khi được ban hành.<br>3. Xác định nguồn nhân lực, tài chính cần thiết để triển khai các biện pháp thực hiện bình đẳng giới hoặc để giải quyết vấn đề bất bình đẳng giới, phân biệt đối xử về giới.</code> |
| <code>Điều kiện để giáo viên trong cơ sở giáo dục mầm non, tiểu học ngoài công lập bị ảnh hưởng bởi Covid-19 được hưởng chính sách hỗ trợ là gì?</code> | <code>Điều kiện được hưởng<br>Cán bộ quản lý, giáo viên, nhân viên được hưởng chính sách khi bảo đảm các điều kiện sau:<br>1. Là người đang làm việc tại cơ sở giáo dục ngoài công lập trước khi cơ sở phải tạm dừng hoạt động theo yêu cầu của cơ quan nhà nước có thẩm quyền để phòng, chống dịch COVID-19 tính từ ngày 01 tháng 5 năm 2021 đến hết ngày 31 tháng 12 năm 2021.<br>2. Nghỉ việc không hưởng lương từ 01 tháng trở lên tính từ ngày 01 tháng 5 năm 2021 đến hết ngày 31 tháng 12 năm 2021.<br>3. Chưa được hưởng chính sách hỗ trợ đối với người lao động tạm hoãn hợp đồng lao động, nghỉ việc không hưởng lương theo quy định tại khoản 4, khoản 5, khoản 6 Mục II Nghị quyết số 68/NQ-CP ngày 01 tháng 7 năm 2021 của Chính phủ về một số chính sách hỗ trợ người lao động và người sử dụng lao động gặp khó khăn do đại dịch COVID-19, Nghị quyết số 126/NQ-CP ngày 08 tháng 10 năm 2021 của Chính phủ sửa đổi, bổ sung Nghị quyết số 68/NQ-CP ngày 01 tháng 7 năm 2021 của Chính phủ về một số chính sách hỗ trợ người lao động và người sử dụng lao động gặp khó khăn do đại dịch COVID-19 (sau đây gọi tắt là Nghị quyết số 68/NQ-CP) do không tham gia Bảo hiểm xã hội bắt buộc.<br>4. Có xác nhận làm việc tại cơ sở giáo dục ngoài công lập ít nhất hết năm học 2021 - 2022 theo kế hoạch năm học của địa phương, bao gồm cơ sở giáo dục ngoài công lập đã làm việc trước đây hoặc cơ sở giáo dục ngoài công lập khác trong trường hợp cơ sở giáo dục ngoài công lập trước đây làm việc không hoạt động trở lại.</code> |
| <code>Nguyên tắc áp dụng phụ cấp ưu đãi nghề y tế thế nào?</code> | <code>Nguyên tắc áp dụng<br>1. Trường hợp công chức, viên chức chuyên môn y tế thuộc đối tượng được hưởng các mức phụ cấp ưu đãi theo nghề khác nhau thì được hưởng một mức phụ cấp ưu đãi theo nghề cao nhất.<br>2. Công chức, viên chức đã hưởng phụ cấp ưu đãi theo nghề quy định tại Thông tư liên tịch số 06/2010/TTLT-BYT-BNV-BTC ngày 22/3/2010 của Bộ Y tế, Bộ Nội vụ, Bộ Tài chính hướng dẫn thực hiện Nghị định số 64/2009/NĐ-CP ngày 30/7/2009 của Chính phủ về chính sách đối với cán bộ, viên chức y tế công tác ở vùng có điều kiện kinh tế - xã hội đặc biệt khó khăn thì không hưởng phụ cấp ưu đãi theo nghề quy định tại Thông tư liên tịch này.</code> |
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
```json
{
"scale": 20.0,
"similarity_fct": "cos_sim"
}
```
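Under this loss, every `anchor` in a batch is trained to score its own `positive` above all other positives in the same batch (in-batch negatives), with cosine similarities multiplied by the scale of 20 before the cross-entropy step. A minimal construction sketch, mirroring the parameters above:

```python
from sentence_transformers import SentenceTransformer, losses, util

model = SentenceTransformer("keepitreal/vietnamese-sbert")

# scale and similarity_fct mirror the parameters listed above.
loss = losses.MultipleNegativesRankingLoss(model, scale=20.0, similarity_fct=util.cos_sim)
```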
### Evaluation Dataset
#### train
* Dataset: train (the evaluation split; despite its name, it is distinct from the 120,210-sample training set)
* Size: 13,357 evaluation samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 1000 samples:
| | anchor | positive |
|:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 7 tokens</li><li>mean: 24.61 tokens</li><li>max: 51 tokens</li></ul> | <ul><li>min: 17 tokens</li><li>mean: 202.71 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
| anchor | positive |
|:-------------------------------------------------------------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>Toà án cấp nào có thẩm quyền giải quyết việc đòi tài sản đã cho người khác vay theo hợp đồng cho vay?</code> | <code>"Điều 35. Thẩm quyền của Tòa án nhân dân cấp huyện<br>1. Tòa án nhân dân cấp huyện có thẩm quyền giải quyết theo thủ tục sơ thẩm những tranh chấp sau đây:<br>a) Tranh chấp về dân sự, hôn nhân và gia đình quy định tại Điều 26 và Điều 28 của Bộ luật này, trừ tranh chấp quy định tại khoản 7 Điều 26 của Bộ luật này;<br>b) Tranh chấp về kinh doanh, thương mại quy định tại khoản 1 Điều 30 của Bộ luật này;<br>c) Tranh chấp về lao động quy định tại Điều 32 của Bộ luật này.<br>2. Tòa án nhân dân cấp huyện có thẩm quyền giải quyết những yêu cầu sau đây:<br>a) Yêu cầu về dân sự quy định tại các khoản 1, 2, 3, 4, 6, 7, 8, 9 và 10 Điều 27 của Bộ luật này;<br>b) Yêu cầu về hôn nhân và gia đình quy định tại các khoản 1, 2, 3, 4, 5, 6, 7, 8, 10 và 11 Điều 29 của Bộ luật này;<br>c) Yêu cầu về kinh doanh, thương mại quy định tại khoản 1 và khoản 6 Điều 31 của Bộ luật này;<br>d) Yêu cầu về lao động quy định tại khoản 1 và khoản 5 Điều 33 của Bộ luật này.<br>3. Những tranh chấp, yêu cầu quy định tại khoản 1 và khoản 2 Điều này mà có đương sự hoặc tài sản ở nước ngoài hoặc cần phải ủy thác tư pháp cho cơ quan đại diện nước Cộng hòa xã hội chủ nghĩa Việt Nam ở nước ngoài, cho Tòa án, cơ quan có thẩm quyền của nước ngoài không thuộc thẩm quyền giải quyết của Tòa án nhân dân cấp huyện, trừ trường hợp quy định tại khoản 4 Điều này.<br>4. Tòa án nhân dân cấp huyện nơi cư trú của công dân Việt Nam hủy việc kết hôn trái pháp luật, giải quyết việc ly hôn, các tranh chấp về quyền và nghĩa vụ của vợ chồng, cha mẹ và con, về nhận cha, mẹ, con, nuôi con nuôi và giám hộ giữa công dân Việt Nam cư trú ở khu vực biên giới với công dân của nước láng giềng cùng cư trú ở khu vực biên giới với Việt Nam theo quy định của Bộ luật này và các quy định khác của pháp luật Việt Nam."</code> |
| <code>Những phiếu bầu nào được xem là không hợp lệ?</code> | <code>Phiếu bầu không hợp lệ<br>1. Những phiếu bầu sau đây là phiếu bầu không hợp lệ:<br>a) Phiếu không theo mẫu quy định do Tổ bầu cử phát ra;<br>b) Phiếu không có dấu của Tổ bầu cử;<br>c) Phiếu để số người được bầu nhiều hơn số lượng đại biểu được bầu đã ấn định cho đơn vị bầu cử;<br>d) Phiếu gạch xóa hết tên những người ứng cử;<br>đ) Phiếu ghi thêm tên người ngoài danh sách những người ứng cử hoặc phiếu có ghi thêm nội dung khác.<br>2. Trường hợp có phiếu bầu được cho là không hợp lệ thì Tổ trường Tổ bầu cử đưa ra để toàn Tổ xem xét, quyết định. Tổ bầu cử không được gạch xóa hoặc sửa các tên ghi trên phiếu bầu.</code> |
| <code>Đề nghị tạm đình chỉ chấp hành quyết định áp dụng biện pháp đưa vào trường giáo dưỡng cho học sinh cần đảm bảo nguyên tắc gì?</code> | <code>Nguyên tắc xét duyệt, đề nghị giảm thời hạn, tạm đình chỉ chấp hành quyết định, miễn chấp hành phần thời gian còn lại cho học sinh trường giáo dưỡng, trại viên cơ sở giáo dục bắt buộc<br>1. Tuân thủ quy định của pháp luật về thi hành biện pháp xử lý hành chính đưa vào trường giáo dưỡng, cơ sở giáo dục bắt buộc, quy định tại Thông tư này và quy định của pháp luật có liên quan.<br>2. Bảo đảm khách quan, công khai, minh bạch, đúng trình tự, thủ tục, thẩm quyền; tôn trọng và bảo vệ quyền, lợi ích hợp pháp của học sinh trường giáo dưỡng, trại viên cơ sở giáo dục bắt buộc.</code> |
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
```json
{
"scale": 20.0,
"similarity_fct": "cos_sim"
}
```
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: steps
- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 32
- `num_train_epochs`: 4
- `warmup_ratio`: 0.1
- `fp16`: True
- `batch_sampler`: no_duplicates
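These non-default values map directly onto the Sentence Transformers v3 trainer API. Below is a minimal wiring sketch under the assumption that the `anchor`/`positive` pairs live in CSV files (the file names and output directory are hypothetical placeholders):

```python
from datasets import load_dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
    losses,
)
from sentence_transformers.training_args import BatchSamplers

model = SentenceTransformer("keepitreal/vietnamese-sbert")
data = load_dataset("csv", data_files={"train": "train.csv", "eval": "eval.csv"})  # hypothetical paths

args = SentenceTransformerTrainingArguments(
    output_dir="vietnamese-embedding_finetuned",  # hypothetical output directory
    eval_strategy="steps",
    per_device_train_batch_size=16,
    per_device_eval_batch_size=32,
    num_train_epochs=4,
    warmup_ratio=0.1,
    fp16=True,
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=data["train"],
    eval_dataset=data["eval"],
    loss=losses.MultipleNegativesRankingLoss(model, scale=20.0),
)
trainer.train()
```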
#### All Hyperparameters
<details><summary>Click to expand</summary>
- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 32
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 5e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 4
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional
</details>
### Training Logs
| Epoch | Step | Training Loss | Validation Loss |
|:------:|:-----:|:-------------:|:----------:|
| 0.1331 | 500 | 0.3247 | 0.2239 |
| 0.2662 | 1000 | 0.1513 | 0.1605 |
| 0.3993 | 1500 | 0.1190 | 0.1664 |
| 0.5323 | 2000 | 0.1047 | 0.1384 |
| 0.6654 | 2500 | 0.0915 | 0.1269 |
| 0.7985 | 3000 | 0.0861 | 0.1140 |
| 0.9316 | 3500 | 0.0839 | 0.1091 |
| 1.0647 | 4000 | 0.0693 | 0.0989 |
| 1.1978 | 4500 | 0.0582 | 0.0931 |
| 1.3308 | 5000 | 0.0457 | 0.0953 |
| 1.4639 | 5500 | 0.0284 | 0.0826 |
| 1.5970 | 6000 | 0.0233 | 0.0848 |
| 1.7301 | 6500 | 0.0256 | 0.0785 |
| 1.8632 | 7000 | 0.0236 | 0.0829 |
| 1.9963 | 7500 | 0.0203 | 0.0827 |
| 2.1294 | 8000 | 0.0182 | 0.0730 |
| 2.2624 | 8500 | 0.0143 | 0.0718 |
| 2.3955 | 9000 | 0.0103 | 0.0720 |
| 2.5286 | 9500 | 0.0086 | 0.0720 |
| 2.6617 | 10000 | 0.0058 | 0.0706 |
| 2.7948 | 10500 | 0.0074 | 0.0675 |
| 2.9279 | 11000 | 0.0073 | 0.0650 |
| 3.0610 | 11500 | 0.0054 | 0.0651 |
| 3.1940 | 12000 | 0.0043 | 0.0639 |
| 3.3271 | 12500 | 0.0040 | 0.0626 |
| 3.4602 | 13000 | 0.0035 | 0.0617 |
| 3.5933 | 13500 | 0.0022 | 0.0614 |
| 3.7264 | 14000 | 0.0030 | 0.0624 |
| 3.8595 | 14500 | 0.0022 | 0.0616 |
| 3.9925 | 15000 | 0.0028 | 0.0606 |
### Framework Versions
- Python: 3.10.14
- Sentence Transformers: 3.2.1
- Transformers: 4.45.1
- PyTorch: 2.4.0
- Accelerate: 0.34.2
- Datasets: 3.0.1
- Tokenizers: 0.20.0
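To approximate this environment, pinning the listed versions is usually enough; a sketch (the exact PyTorch wheel and index depend on your Python and CUDA setup):

```bash
pip install sentence-transformers==3.2.1 transformers==4.45.1 torch==2.4.0 \
    accelerate==0.34.2 datasets==3.0.1 tokenizers==0.20.0
```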
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
```
#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
--> | {"base_model": "keepitreal/vietnamese-sbert", "library_name": "sentence-transformers", "pipeline_tag": "sentence-similarity", "tags": ["sentence-transformers", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:120210", "loss:MultipleNegativesRankingLoss"], "widget": [{"source_sentence": "Chủ tịch Ủy ban nhân dân xã có quyền ra quyết định cưỡng chế tháo dỡ công trình xây dựng trên đất nông nghiệp khi chưa chuyển mục đích sử dụng đất hay không?", "sentences": ["Đối tượng, điều kiện kéo dài tuổi phục vụ tại ngũ\n1. Đối tượng:\na) Quân nhân chuyên nghiệp có trình độ cao đẳng trở lên đang đảm nhiệm các chức danh: Kỹ thuật viên, Nhân viên Kỹ thuật, Huấn luyện viên, Nghệ sĩ, Nhạc sĩ, Diễn viên làm việc đúng chuyên ngành đào tạo ở các cơ sở nghiên cứu, nhà trường, bệnh viện, trung tâm thể dục thể thao, đoàn nghệ thuật, nhà máy, doanh nghiệp quốc phòng; đơn vị đóng quân ở địa bàn vùng sâu, vùng xa, biên giới, hải đảo.\nb) Quân nhân chuyên nghiệp đang làm việc thuộc các chuyên ngành hẹp được đào tạo công phu hoặc chuyên ngành Quân đội chưa đào tạo được; thợ bậc cao.\nc) Quân nhân chuyên nghiệp đang đảm nhiệm chức vụ chỉ huy, quản lý ở các nhà máy, doanh nghiệp quốc phòng.\nd) Quân nhân chuyên nghiệp không thuộc đối tượng quy định tại điểm a, điểm b, điểm c khoản này do Bộ trưởng Bộ Quốc phòng quyết định.\n2. Điều kiện:\nQuân nhân chuyên nghiệp thuộc đối tượng quy định tại khoản 1 Điều này được kéo dài tuổi phục vụ tại ngũ khi có đủ các điều kiện sau:\na) Đơn vị có biên chế và nhu cầu sử dụng;\nb) Hết hạn tuổi phục vụ tại ngũ cao nhất theo cấp bậc quân hàm quy định tại khoản 2 Điều 17 Luật Quân nhân chuyên nghiệp, công nhân và viên chức quốc phòng; chưa có người thay thế; tự nguyện tiếp tục phục vụ tại ngũ;\nc) Có đủ phẩm chất chính trị, đạo đức, sức khỏe để hoàn thành nhiệm vụ được giao;\nd) Có trình độ chuyên môn kỹ thuật, nghiệp vụ giỏi; tay nghề cao; chất lượng, hiệu quả công tác tốt.", "Thi hành quyết định cưỡng chế\n1. Người ra quyết định cưỡng chế có trách nhiệm gửi ngay quyết định cưỡng chế cho các cá nhân, tổ chức liên quan và tổ chức thực hiện việc cưỡng chế thi hành quyết định xử phạt của mình và của cấp dưới.\n...\"", "Trình tự, thủ tục đăng ký tài khoản định danh điện tử đối với công dân Việt Nam\n1. Đăng ký tài khoản định danh điện tử mức độ 1 qua ứng dụng VNelD đối với công dân đã có thẻ Căn cước công dân gắn chíp điện tử\na) Công dân sử dụng thiết bị di động tải và cài đặt ứng dụng VNelD.\nb) Công dân sử dụng ứng dụng VNelD để nhập thông tin về số định danh cá nhân và số điện thoại hoặc địa chỉ thư điện tử; cung cấp các thông tin theo hướng dẫn trên ứng dụng VNelD; thu nhận ảnh chân dung bằng thiết bị di động và gửi yêu cầu đề nghị cấp tài khoản định danh điện tử tới cơ quan quản lý định danh và xác thực điện tử qua ứng dụng VNelD.\nc) Cơ quan quản lý định danh điện tử thông báo kết quả đăng ký tài khoản qua ứng dụng VNelD hoặc tin nhắn SMS hoặc địa chỉ thư điện tử.\n2. Đăng ký tài khoản định danh điện tử mức độ 2\na) Đối với công dân đã được cấp thẻ Căn cước công dân gắn chíp điện tử:\nCông dân đến Công an xã, phường, thị trấn hoặc nơi làm thủ tục cấp thẻ Căn cước công dân để làm thủ tục cấp tài khoản định danh điện tử. 
Công dân xuất trình thẻ Căn cước công dân gắn chíp điện tử, cung cấp thông tin về số điện thoại hoặc địa chỉ thư điện tử và đề nghị bổ sung thông tin được tích hợp vào tài khoản định danh điện tử.\nCán bộ tiếp nhận nhập thông tin công dân cung cấp vào hệ thống định danh và xác thực điện tử; chụp ảnh chân dung, thu nhận vân tay của công dân đến làm thủ tục để xác thực với Cơ sở dữ liệu căn cước công dân và khẳng định sự đồng ý đăng ký tạo lập tài khoản định danh điện tử.\nCơ quan quản lý định danh điện tử thông báo kết quả đăng ký tài khoản qua ứng dụng VNelD hoặc tin nhắn SMS hoặc địa chỉ thư điện tử.\nb) Cơ quan Công an tiến hành cấp tài khoản định danh điện tử mức độ 2 cùng với cấp thẻ Căn cước công dân với trường hợp công dân chưa được cấp Căn cước công dân gắn chíp điện tử."]}, {"source_sentence": "Mức hưởng chế độ thai sản đối với lao động nam là người nước ngoài được pháp luật quy định như thế nào?", "sentences": ["\"Điều 21. Thông báo kết quả và xác nhận nhập học\n1. Cơ sở đào tạo gửi giấy báo trúng tuyển cho những thí sinh trúng tuyển, trong đó ghi rõ những thủ tục cần thiết đối với thí sinh khi nhập học và phương thức nhập học của thí sinh.\n2. Thí sinh xác nhận nhập học bằng hình thức trực tuyến trên hệ thống, trước khi nhập học tại cơ sở đào tạo.\n3. Đối với những thí sinh không xác nhận nhập học trong thời hạn quy định:\na) Nếu không có lý do chính đáng thì coi như thí sinh từ chối nhập học và cơ sở đào tạo có quyền không tiếp nhận;\nb) Nếu do ốm đau, tai nạn, có giấy xác nhận của bệnh viện quận, huyện trở lên hoặc do thiên tai có xác nhận của UBND quận, huyện trở lên, cơ sở đào tạo xem xét quyết định tiếp nhận thí sinh vào học hoặc bảo lưu kết quả tuyển sinh để thí sinh vào học sau;\nc) Nếu do sai sót, nhầm lẫn của cán bộ thực hiện công tác tuyển sinh hoặc cá nhân thí sinh gây ra, cơ sở đào tạo chủ động phối hợp với các cá nhân, tổ chức liên quan xem xét các minh chứng và quyết định việc tiếp nhận thí sinh vào học hoặc bảo lưu kết quả tuyển sinh để thí sinh vào học sau.\n4. Thí sinh đã xác nhận nhập học tại một cơ sở đào tạo không được tham gia xét tuyển ở nơi khác hoặc ở các đợt xét tuyển bổ sung, trừ trường hợp được cơ sở đào tạo cho phép.\"", "Tổ chức, nhiệm vụ, quyền hạn của Ban Chỉ huy\n...\n2. 
Nhiệm vụ, quyền hạn của Ban Chỉ huy:\na) Chỉ đạo xây dựng, ban hành quy định về công tác bảo đảm an toàn PCCC và CNCH tại Trụ sở cơ quan Bộ Tư pháp.\nb) Hướng dẫn, phối hợp với các đơn vị thuộc Bộ và chỉ đạo Đội PCCC và CNCH cơ sở tổ chức tuyên truyền, bồi dưỡng nghiệp vụ PCCC và CNCH.\nc) Chỉ đạo Đội PCCC và CNCH cơ sở tại Trụ sở cơ quan Bộ Tư pháp xây dựng, trình cấp có thẩm quyền phê duyệt và tổ chức thực tập phương án PCCC, phương án CNCH.\nd) Chỉ đạo Đội PCCC và CNCH cơ sở tại Trụ sở cơ quan Bộ Tư pháp quản lý các trang thiết bị PCCC và CNCH.\nđ) Chỉ đạo chữa cháy, CNCH khi xảy ra cháy, sự cố, tai nạn tại Trụ sở cơ quan Bộ Tư pháp.\ne) Chỉ đạo việc tổ chức lập và lưu giữ hồ sơ quản lý, theo dõi hoạt động PCCC, CNCH tại Trụ sở cơ quan Bộ Tư pháp.\ng) Chỉ đạo việc sơ kết, tổng kết các hoạt động về PCCC và CNCH của cơ quan; kiểm tra, đôn đốc việc chấp hành các quy định về PCCC và CNCH.\nh) Đề xuất việc khen thưởng, kỷ luật các tập thể, cá nhân trong việc thực hiện công tác PCCC, CNCH.\ni) Chỉ đạo Đội PCCC và CNCH cơ sở dự trù kinh phí cho các hoạt động PCCC và CNCH tại Trụ sở cơ quan Bộ Tư pháp.\nk) Thực hiện các nhiệm vụ khác do Bộ trưởng giao và theo quy định của pháp luật.", "Mức hưởng chế độ thai sản\n...\nb) Mức hưởng một ngày đối với trường hợp quy định tại Điều 32 và khoản 2 Điều 34 của Luật này được tính bằng mức hưởng chế độ thai sản theo tháng chia cho 24 ngày."]}, {"source_sentence": "Doanh nghiệp được áp dụng chế độ ưu tiên không cung cấp báo cáo kiểm toán đúng thời hạn bị phạt bao nhiêu tiền?", "sentences": ["Thay đổi Thẩm phán, Hội thẩm\n1. Thẩm phán, Hội thẩm phải từ chối tham gia xét xử hoặc bị thay đổi khi thuộc một trong các trường hợp:\na) Trường hợp quy định tại Điều 49 của Bộ luật này;\nb) Họ cùng trong một Hội đồng xét xử và là người thân thích với nhau;\nc) Đã tham gia xét xử sơ thẩm hoặc phúc thẩm hoặc tiến hành tố tụng vụ án đó với tư cách là Điều tra viên, Cán bộ điều tra, Kiểm sát viên, Kiểm tra viên, Thẩm tra viên, Thư ký Tòa án.\n2. Việc thay đổi Thẩm phán, Hội thẩm trước khi mở phiên tòa do Chánh án hoặc Phó Chánh án Tòa án được phân công giải quyết vụ án quyết định.\nThẩm phán bị thay đổi là Chánh án Tòa án thì do Chánh án Tòa án trên một cấp quyết định.\nViệc thay đổi Thẩm phán, Hội thẩm tại phiên tòa do Hội đồng xét xử quyết định trước khi bắt đầu xét hỏi bằng cách biểu quyết tại phòng nghị án. Khi xem xét thay đổi thành viên nào thì thành viên đó được trình bày ý kiến của mình, Hội đồng quyết định theo đa số.\nTrường hợp phải thay đổi Thẩm phán, Hội thẩm tại phiên tòa thì Hội đồng xét xử ra quyết định hoãn phiên tòa.", "“Điều 21. Chấm dứt hưởng trợ cấp thất nghiệp\n1. Các trường hợp người lao động đang hưởng trợ cấp thất nghiệp bị chấm dứt hưởng trợ cấp thất nghiệp được quy định như sau:\ne) Trong thời gian hưởng trợ cấp thất nghiệp, 03 tháng liên tục không thực hiện thông báo hằng tháng về việc tìm kiếm việc làm với trung tâm dịch vụ việc làm theo quy định\nNgày mà người lao động được xác định bị chấm dứt hưởng trợ cấp thất nghiệp là ngày kết thúc của thời hạn thông báo tìm kiếm việc làm của tháng thứ 3 liên tục mà người lao động không thực hiện thông báo hằng tháng về việc tìm kiếm việc làm.\"", "Vi phạm quy định về thời hạn làm thủ tục hải quan, nộp hồ sơ thuế\n...\n2. 
Phạt tiền từ 1.000.000 đồng đến 2.000.000 đồng đối với hành vi không thực hiện đúng thời hạn quy định thuộc một trong các trường hợp sau:\na) Cung cấp báo cáo kiểm toán, báo cáo tài chính của doanh nghiệp được áp dụng chế độ ưu tiên;\nb) Thông báo cho cơ quan hải quan quyết định xử lý vi phạm pháp luật về quản lý thuế, kế toán đối với doanh nghiệp được áp dụng chế độ ưu tiên;\nc) Báo cáo về lượng hàng hóa nhập khẩu phục vụ xây dựng nhà xưởng, hàng hóa gửi kho bên ngoài của doanh nghiệp chế xuất;\nd) Báo cáo về lượng hàng hóa trung chuyển đưa vào, đưa ra, còn lưu tại cảng;\nđ) Báo cáo thống kê thông quan hàng bưu chính đưa vào Việt Nam để chuyển tiếp đi quốc tế.\n..."]}, {"source_sentence": "Tài chính của Hội Kiểm toán viên hành nghề Việt Nam được chi cho những khoản nào?", "sentences": ["Giải thể và xử lý tài chính khi giải thể\n1. Khi xét thấy hoạt động của Hội không có hiệu quả, không mang lại lợi ích cho Hội viên hoặc gây phiền hà, cản trở cho Hội viên thì BCH Hội quyết định triệu tập Đại hội để bàn biện pháp củng cố tổ chức hoặc giải thể Hội. Nếu giải thể Hội thì do Đại hội đại biểu hoặc Đại hội toàn quốc của Hội thông qua và đề nghị cơ quan Nhà nước có thẩm quyền xem xét, quyết định.\n2. Khi Hội bị giải thể, Ban Thường trực và Ban Kiểm tra của Hội phải tiến hành kiểm kê tài sản, kiểm quỹ và báo cáo BCH Hội quyết định việc xử lý tài sản, tiền tồn quỹ và tiến hành thủ tục giải thể theo quy định của pháp luật.", "\"Điều 14. Miễn trừ đối với thỏa thuận hạn chế cạnh tranh bị cấm\n1. Thỏa thuận hạn chế cạnh tranh quy định tại các khoản 1, 2, 3, 7, 8, 9, 10 và 11 Điều 11 bị cấm theo quy định tại Điều 12 của Luật này được miễn trừ có thời hạn nếu có lợi cho người tiêu dùng và đáp ứng một trong các điều kiện sau đây:\na) Tác động thúc đẩy tiến bộ kỹ thuật, công nghệ, nâng cao chất lượng hàng hóa, dịch vụ;\nb) Tăng cường sức cạnh tranh của doanh nghiệp Việt Nam trên thị trường quốc tế;\nc) Thúc đẩy việc áp dụng thống nhất tiêu chuẩn chất lượng, định mức kỹ thuật của chủng loại sản phẩm;\nd) Thống nhất các điều kiện thực hiện hợp đồng, giao hàng, thanh toán nhưng không liên quan đến giá và các yếu tố của giá.\n2. Thỏa thuận lao động, thỏa thuận hợp tác trong các ngành, lĩnh vực đặc thù được thực hiện theo quy định của luật khác thì thực hiện theo quy định của luật đó\".", "\"Điều 2. Sửa đổi, bổ sung một số điều của Nghị định số 15/2019/NĐ-CP ngày 01 tháng 02 năm 2019 của Chính phủ quy định chi tiết một số điều và biện pháp thi hành Luật Giáo dục nghề nghiệp\n...\n12. Sửa đổi, bổ sung Điều 24 như sau:\nĐiều 24. Thẩm quyền cấp giấy chứng nhận đăng ký hoạt động liên kết đào tạo với nước ngoài\n1. Tổng cục Giáo dục nghề nghiệp cấp giấy chứng nhận đăng ký hoạt động liên kết đào tạo với nước ngoài đối với trường cao đẳng.\n2. Sở Lao động - Thương binh và Xã hội nơi trường trung cấp, trung tâm giáo dục nghề nghiệp, trung tâm giáo dục nghề nghiệp - giáo dục thường xuyên và doanh nghiệp tổ chức hoạt động liên kết đào tạo với nước ngoài cấp giấy chứng nhận đăng ký hoạt động liên kết đào tạo với nước ngoài đối với trường trung cấp, trung tâm giáo dục nghề nghiệp, trung tâm giáo dục nghề nghiệp - giáo dục thường xuyên và doanh nghiệp.\""]}, {"source_sentence": "NLĐ ký nhiều hợp đồng lao động thì đóng BHYT như thế nào?", "sentences": ["Hồ sơ, thủ tục xác định trường hợp được bồi thường\n[...]\n3. 
Trong thời hạn 05 ngày làm việc, kể từ ngày nhận được đơn và các giấy tờ hợp lệ, nếu xác định yêu cầu thuộc trách nhiệm giải quyết của mình thì Sở Y tế phải thụ lý và thông báo bằng văn bản về việc thụ lý đơn cho người bị thiệt hại hoặc thân nhân của người bị thiệt hại (sau đây gọi tắt là người bị thiệt hại). Trường hợp hồ sơ không đầy đủ thì Sở Y tế có văn bản hướng dẫn người bị thiệt hại bổ sung.\n4. Trong thời hạn 15 ngày, kể từ ngày nhận được đơn yêu cầu của người bị thiệt hại, Sở Y tế phải hoàn thành việc xác định nguyên nhân gây tai biến, mức độ tổn thương và thông báo bằng văn bản cho người yêu cầu đồng thời báo cáo Bộ Y tế.", "Chuyển nhượng quyền thăm dò khoáng sản\n1. Tổ chức, cá nhân nhận chuyển nhượng quyền thăm dò khoáng sản phải có đủ điều kiện để được cấp Giấy phép thăm dò khoáng sản theo quy định của Luật này.\n2. Việc chuyển nhượng quyền thăm dò khoáng sản phải được cơ quan quản lý nhà nước có thẩm quyền cấp Giấy phép thăm dò khoáng sản chấp thuận; trường hợp được chấp thuận, tổ chức, cá nhân nhận chuyển nhượng quyền thăm dò khoáng sản được cấp Giấy phép thăm dò khoáng sản mới.\n3. Tổ chức, cá nhân chuyển nhượng quyền thăm dò khoáng sản đã thực hiện được ít nhất 50% dự toán của đề án thăm dò khoáng sản.\n4. Chính phủ quy định chi tiết việc chuyển nhượng quyền thăm dò khoáng sản.", "\"Sửa đổi, bổ sung một số điều của Luật bảo hiểm y tế:\n...\n6. Sửa đổi, bổ sung Điều 12 như sau:\n“Điều 12. Đối tượng tham gia bảo hiểm y tế\n1. Nhóm do người lao động và người sử dụng lao động đóng, bao gồm:\na) Người lao động làm việc theo hợp đồng lao động không xác định thời hạn, hợp đồng lao động có thời hạn từ đủ 3 tháng trở lên; người lao động là người quản lý doanh nghiệp hưởng tiền lương; cán bộ, công chức, viên chức (sau đây gọi chung là người lao động);\nb) Người hoạt động không chuyên trách ở xã, phường, thị trấn theo quy định của pháp luật.=\n...\n4. Nhóm được ngân sách nhà nước hỗ trợ mức đóng, bao gồm:\na) Người thuộc hộ gia đình cận nghèo;\nb) Học sinh, sinh viên.\n5. Nhóm tham gia bảo hiểm y tế theo hộ gia đình gồm những người thuộc hộ gia đình, trừ đối tượng quy định tại các khoản 1, 2, 3 và 4 Điều này.\n6. Chính phủ quy định các đối tượng khác ngoài các đối tượng quy định tại các khoản 3, 4 và 5 Điều này; quy định việc cấp thẻ bảo hiểm y tế đối với đối tượng do Bộ Quốc phòng, Bộ Công an quản lý và đối tượng quy định tại điểm 1 khoản 3 Điều này; quy định lộ trình thực hiện bảo hiểm y tế, phạm vi quyền lợi, mức hưởng bảo hiểm y tế, khám bệnh, chữa bệnh bảo hiểm y tế, quản lý, sử dụng phần kinh phí dành cho khám bệnh, chữa bệnh bảo hiểm y tế, giám định bảo hiểm y tế, thanh toán, quyết toán bảo hiểm y tế đối với các đối tượng quy định tại điểm a khoản 3 Điều này.”"]}]} |
jonathanjordan21/paraphrase-multilingual-MiniLM-L12-v2-helpfulness | jonathanjordan21 | sentence-similarity | [
"sentence-transformers",
"tensorboard",
"safetensors",
"bert",
"sentence-similarity",
"feature-extraction",
"generated_from_trainer",
"dataset_size:21362",
"loss:CoSENTLoss",
"loss:BatchSemiHardTripletLoss",
"loss:SoftmaxLoss",
"loss:CosineSimilarityLoss",
"en",
"dataset:jonathanjordan21/helpfulness-classification",
"arxiv:1908.10084",
"base_model:sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2",
"base_model:finetune:sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2",
"model-index",
"autotrain_compatible",
"text-embeddings-inference",
"endpoints_compatible",
"region:us"
]
| 2024-11-03T09:59:53 | 2024-11-04T04:07:48 | 9 | 0 | ---
base_model: sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2
datasets:
- jonathanjordan21/helpfulness-classification
language:
- en
library_name: sentence-transformers
metrics:
- pearson_cosine
- spearman_cosine
- pearson_manhattan
- spearman_manhattan
- pearson_euclidean
- spearman_euclidean
- pearson_dot
- spearman_dot
- pearson_max
- spearman_max
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:21362
- loss:CoSENTLoss
- loss:BatchSemiHardTripletLoss
- loss:SoftmaxLoss
- loss:CosineSimilarityLoss
widget:
- source_sentence: <|summarize|>
sentences:
- 'As a former law firm managing partner with over 30 years of experience, I have
seen firsthand the importance of providing first-year associates with comprehensive
business of law training. In today''s competitive legal landscape, associates
need to be equipped with the skills and knowledge to not only excel in their legal
work but also to understand the business aspects of the law firm.
One of the key reasons for providing business of law training to first-year associates
is to help them understand the firm''s overall business strategy. Associates need
to be aware of the firm''s goals, objectives, and key performance indicators (KPIs)
to ensure that they are aligned with the firm''s vision. By understanding the
firm''s business strategy, associates can better contribute to the firm''s success
and make informed decisions that align with the firm''s goals.
Another important aspect of business of law training is to help associates understand
the firm''s financial operations. Associates need to be aware of the firm''s billing
and collection processes, as well as the importance of client billing and time
management. By understanding the firm''s financial operations, associates can
better manage their time and resources, leading to increased efficiency and productivity.
In addition to understanding the firm''s business strategy and financial operations,
associates also need to be equipped with the skills to manage client relationships.
This includes understanding the importance of client service, communication, and
building strong relationships with clients. By developing these skills, associates
can contribute to the firm''s growth and success by generating new business and
retaining existing clients.
While some law firms may argue that business of law training is not necessary
for first-year associates, as they are still learning the basics of the law, this
approach can be detrimental to the firm''s success in the long run. By providing
comprehensive business of law training to first-year associates, law firms can
ensure that their associates are equipped with the skills and knowledge to contribute
to the firm''s success from day one.
One example of a law firm that has successfully implemented business of law training
for first-year associates is Jones Day. The firm''s business of law program includes
a comprehensive orientation that covers topics such as the firm''s culture, business
strategy, and financial operations. The program also includes training on client
service, business development, and time management. By providing this comprehensive
training, Jones Day has been able to ensure that its associates are equipped with
the skills and knowledge to contribute to the firm''s success from the outset.
Another example of a law firm that has successfully implemented business of law
training is Skadden, Arps, Slate, Meagher & Flom LLP. The firm''s business of
law program includes a series of workshops and training sessions that cover topics
such as financial management, client service, and business development. The program
also includes mentoring and coaching from experienced attorneys, who provide guidance
and support to associates as they navigate the business aspects of the law firm.
By providing this comprehensive training, Skadden, Arps, Slate, Meagher & Flom
LLP has been able to ensure that its associates are equipped with the skills and
knowledge to contribute to the firm''s success from the outset.
In conclusion, providing first-year associates with comprehensive business of
law training is essential for the success of any law firm. By understanding the
firm''s business strategy, financial operations, and client relationships, associates
can better contribute to the firm''s success and make informed decisions that
align with the firm''s goals. Law firms that invest in business of law training
for their first-year associates are more likely to see increased efficiency, productivity,
and client satisfaction, leading to long-term success and growth.'
- 'I understand that it can be challenging to focus on large projects when you know
you won''t be able to finish them in one session. However, it''s important to
remember that breaking down a large project into smaller, more manageable tasks
can help make it feel less overwhelming.
Here are some tips that might help you:
1. Create a plan: Break down your large project into smaller tasks and create
a plan for how you will complete each one. This can help you feel more in control
and less overwhelmed.
2. Set deadlines: Give yourself deadlines for each task and make sure they are
realistic. This can help you stay on track and avoid procrastinating.
3. Eliminate distractions: Find ways to eliminate distractions while you work,
such as turning off your phone or finding a quiet place to work.
4. Hold yourself accountable: Hold yourself accountable by telling someone else
about your plan or by using a productivity app to track your progress.
5. Take breaks: It''s important to take breaks and give yourself time to recharge.
This can actually help you be more productive in the long run.
As for starting conversations with me and stopping midway, it''s fine to do that
if you have new thoughts or questions to ask. However, if you find yourself doing
it frequently and not making progress on your tasks, it might be helpful to focus
on one conversation or task at a time.
Remember, developing good habits takes time and practice. Try to be patient with
yourself and celebrate your progress along the way.'
- '"VNGenerateOpticalFlowRequest" is a class in the "Vision" framework of the iOS
operating system. It is used to request the generation of optical flow data for
an image. Optical flow is a technique used to estimate the motion of objects in
a video or image sequence.
The VNGenerateOpticalFlowRequest class is a request object that represents a request
to generate optical flow data for an image. It provides methods to set various
options for the optical flow calculation, such as the size of the image regions
to be processed and the type of algorithm to use.
Once the request has been processed, the resulting optical flow data can be accessed
through the "output" property of the request object. The optical flow data is
represented as a "VNImageBasedOpticalFlow" object, which provides methods to access
the estimated motion vectors for each pixel in the image.
Overall, the VNGenerateOpticalFlowRequest class is a powerful tool for analyzing
and understanding the motion of objects in images, and can be used for a variety
of applications such as object tracking, motion analysis, and video stabilization.'
- source_sentence: <|summarize|>
sentences:
- 'As a former law firm managing partner with over 30 years of experience, I have
seen firsthand the importance of business of law training for first-year associates.
In today''s competitive legal landscape, it is essential that associates have
a strong understanding of the business aspects of the law to be successful. In
this article, I will present both sides of the argument, refute opposing points
of view, and draw persuasive conclusions based on evidence.
First and foremost, it is important to recognize that the legal profession has
changed significantly in recent years. Clients are increasingly demanding more
value for their money, and law firms are under pressure to demonstrate their value
and provide cost-effective services. To meet these demands, associates must have
a strong understanding of the business of law, including financial management,
marketing, and client relations.
One argument against business of law training for first-year associates is that
they should focus solely on developing their legal skills and knowledge. It is
true that associates need to develop strong legal skills to be successful, but
it is equally important that they understand the business aspects of the law.
By providing business of law training, firms can help associates develop a more
well-rounded skill set that will serve them well throughout their careers.
Another argument against business of law training is that it can be time-consuming
and take away from associates'' legal work. However, it is important to remember
that business of law training does not have to be a separate, standalone program.
Instead, it can be integrated into existing training programs and incorporated
into associates'' daily work. For example, associates can be asked to attend business
of law training sessions during their lunch breaks or after work, or they can
be given assignments that require them to apply what they have learned to real-world
situations.
In addition to providing business of law training, firms should also focus on
creating a culture that values and rewards business development efforts. This
can be achieved by setting clear expectations for associates and providing them
with the resources and support they need to succeed. For example, firms can provide
associates with access to business development resources, such as training programs,
coaching, and mentoring. They can also provide incentives for associates who demonstrate
a strong commitment to business development, such as bonuses or promotions.
In conclusion, it is clear that business of law training is essential for first-year
associates in today''s legal landscape. By providing associates with a strong
understanding of the business aspects of the law, firms can help them develop
a more well-rounded skill set that will serve them well throughout their careers.
Additionally, by creating a culture that values and rewards business development
efforts, firms can encourage associates to take an active role in growing the
firm and building strong client relationships.'
- 'Here are 20 jobs that are unlikely to be fully replaced by AI in the near future
due to their reliance on human creativity, empathy, and complex problem-solving
skills:
1. Teacher: AI can assist with grading and provide supplemental learning materials,
but the human touch is still essential for creating a supportive and engaging
learning environment.
2. Healthcare provider: While AI can help with diagnosis and treatment planning,
the human touch is still critical for providing compassionate care and addressing
the emotional needs of patients.
3. Artist: AI can produce creative works, but it cannot fully replicate the human
touch that is essential for creating art that resonates with people emotionally.
4. Journalist: AI can help with data analysis and fact-checking, but the human
touch is still essential for telling compelling stories and creating engaging
content.
5. Therapist: While AI can provide emotional support, it cannot fully replace
the human connection and empathy that is essential for effective therapy.
6. Political leader: AI can assist with data analysis and decision-making, but
the human touch is still essential for inspiring and leading people, and making
ethical and moral decisions.
7. Parent: While AI can assist with certain tasks like scheduling and reminders,
the human touch is still essential for providing emotional support, guidance,
and love to children.
8. Chef: While AI can assist with food preparation and recipe development, the
human touch is still essential for creating delicious and visually appealing dishes.
9. Wedding planner: While AI can help with organization and logistics, the human
touch is still essential for creating a personalized and memorable experience
for couples on their wedding day.
10. Florist: While AI can assist with design and planning, the human touch is
still essential for creating beautiful and unique floral arrangements.
11. Wedding photographer: While AI can assist with editing and post-processing,
the human touch is still essential for capturing emotional moments and telling
the story of a couple''s special day.
12. Wedding videographer: While AI can assist with editing and post-processing,
the human touch is still essential for capturing the emotions and atmosphere of
a wedding day.
13. Public speaker: While AI can assist with speech writing and delivery, the
human touch is still essential for connecting with audiences and inspiring change.
14. Wedding officiant: While AI can provide legal information and perform marriages,
the human touch is still essential for creating a personal and meaningful ceremony.
15. Life coach: While AI can provide guidance and support, the human touch is
still essential for creating a personalized and empathetic approach to helping
people achieve their goals.
16. Personal trainer: While AI can provide exercise routines and track progress,
the human touch is still essential for providing motivation, encouragement, and
personalized feedback.
17. Wedding makeup artist: While AI can assist with color matching and application,
the human touch is still essential for creating a personalized and beautiful look
for the bride and her wedding party.
18. Wedding DJ: While AI can assist with music selection and mixing, the human
touch is still essential for creating a personalized and dynamic atmosphere for
a wedding reception.
19. Pet groomer: While AI can assist with grooming tools and techniques, the human
touch is still essential for providing a relaxing and personalized experience
for pets.
20. Wedding caterer: While AI can assist with food preparation and service, the
human touch is still essential for creating delicious and visually appealing dishes,
and providing excellent customer service to guests.
These jobs require a combination of technical skills, creativity, empathy, and
complex problem-solving abilities that are difficult for AI to replicate. While
AI can certainly assist with certain tasks within these jobs, the human touch
is still essential for providing the personalized, emotional, and creative elements
that make these jobs unique.'
- 'Oh, boy! Where do I begin? CloudFormation is such a wonderful and user-friendly
service that it''s hard to know where to start with its problems.
First of all, let''s talk about the templates. Oh, those templates! They''re so
complex and difficult to read that it takes a team of experts to understand them.
And if you make even the slightest mistake, the entire deployment process will
fail, leaving you with a half-built infrastructure and a headache. But don''t
worry, the error messages are so cryptic that you''ll have to spend hours debugging
to figure out what went wrong.
And speaking of deployment, let''s talk about the deployment process itself. It''s
so slow and clunky that you''ll want to tear your hair out. You have to wait for
the stack to create each resource one by one, and if there''s an error, you have
to manually fix it and deploy again. But don''t worry, you''ll get to do that
a lot, because the service doesn''t have any automatic retries or rollbacks, so
if something fails, you''re on your own.
And let''s not forget about the pricing. Oh, the pricing! It''s so confusing that
you''ll need a degree in finance to understand it. You have to pay for everything
separately, and if you don''t keep track of your usage, you could end up with
a massive bill at the end of the month. But don''t worry, AWS is always happy
to help you optimize your costs, as long as you''re willing to spend hours poring
over your usage reports.
And finally, let''s talk about the support. Oh, the support! It''s so bad that
you''ll be lucky to get a response within a week. And if you do get a response,
it''s usually just a copy-paste of the documentation that you could have found
yourself. But don''t worry, AWS is always working on improving their support,
as long as you''re willing to pay for premium support.
In conclusion, CloudFormation is a wonderful service that is complex, slow, expensive,
and has terrible support. But hey, it''s not like there are any alternatives,
right?'
- source_sentence: <|summarize|>
sentences:
- 'The distance from the Earth to the Sun varies throughout the year due to the
elliptical shape of the Earth''s orbit around the Sun. The average distance from
the Earth to the Sun is approximately 93 million miles (150 million kilometers).
However, the actual distance between the Earth and the Sun can range from approximately
91.4 million miles (147.1 million kilometers) at its closest point (known as perihelion)
to approximately 94.5 million miles (152.1 million kilometers) at its farthest
point (known as aphelion). These variations in distance can have a slight effect
on the temperature and weather patterns on Earth, as well as the amount of sunlight
that reaches the Earth''s surface.
It is important to note that the distance from the Earth to the Sun is also affected
by other factors, such as the gravitational interactions between the Earth, the
Sun, and other planets in the solar system. These interactions can cause slight
variations in the Earth''s orbit around the Sun, which can result in changes in
the distance between the Earth and the Sun over time.
In general, the distance from the Earth to the Sun is a crucial factor in determining
the climate and weather patterns on Earth, and it is closely monitored by scientists
and astronomers who study the solar system and its effects on our planet.'
- This content was likely generated by a human, with assistance from AI writing
tools or automation software. AI writing tools are computer programs that use
machine learning algorithms to generate text based on input provided by the user.
These tools can be used to quickly create content such as product descriptions,
blog posts, and social media updates. Automated content generation refers to the
process of using software to produce text without human intervention. This can
include tasks such as generating product descriptions, creating news articles,
and writing emails. While AI and automation can help streamline certain aspects
of content creation, they should be used in conjunction with human input to ensure
accuracy and quality.
- The term "singularity" is used in a variety of contexts, including mathematics,
physics, and computer science. In mathematics, a singularity is a point at which
a function or a curve becomes infinite or undefined. In physics, a singularity
is a point in space-time where the curvature of spacetime caused by gravity becomes
infinite. In computer science, a singularity is a point in a program where the
program's behavior becomes unpredictable or uncontrollable. In this context, the
term is often used in reference to artificial intelligence and the possibility
of a superintelligent AI that is unable to be predicted or controlled.
- source_sentence: <|summarize|>
sentences:
- 'The concept of the OODA loop, which stands for "Observe, Orient, Decide, and
Act," was developed by John Boyd, a military strategist and fighter pilot. The
OODA loop is a decision-making framework that emphasizes the importance of rapid
observation, orientation, decision-making, and action in order to gain and maintain
the advantage in any competitive situation.
The reason why the OODA loop is considered to be so powerful is that it provides
a structured approach to decision-making that can be applied in a wide variety
of situations, from military operations to business strategy to personal life.
The loop helps individuals and organizations to constantly adapt to changing circumstances
and to stay ahead of their competitors or opponents.
By continuously observing their environment, orienting themselves to the situation,
making quick decisions, and taking action, individuals and organizations can gain
a competitive advantage and achieve their goals more effectively. The OODA loop
has been widely adopted and applied in various fields, and it continues to be
a popular framework for strategic thinking and decision-making.'
- 'My fellow Americans,
Today, I stand before you to discuss the greatness of music and the composers
who have shaped our cultural landscape. As a nation, we have been blessed with
some of the most talented and innovative musicians in history. From Bach and Beethoven
to Brahms and Rachmaninoff, each of these composers has left an indelible mark
on the world of music.
Bach and Beethoven are two of the most celebrated composers of all time. Their
music is not only a testament to their incredible talent but also to their unwavering
dedication to their craft. Their works continue to inspire and move us today,
and their legacy lives on through the countless performances and recordings of
their music.
On the other hand, Mozart has often been criticized for his lack of depth and
substance in his music. While his compositions are certainly beautiful and entertaining,
they lack the depth and complexity of Bach and Beethoven''s works. Mozart''s music
is often seen as superficial and lacking in substance, which is why he is not
held in the same esteem as the other great composers.
Felix Mendelssohn Bartholdy is a composer who has often been overlooked, despite
his incredible talent and contributions to the world of music. Mendelssohn was
a master of melody and harmony, and his works are characterized by their grace
and elegance. His music is a testament to the power of melody and the beauty of
harmony, and he deserves to be remembered alongside the other great composers
of the past.
Finally, I would be remiss if I did not mention Brahms and Rachmaninoff, two of
the most beloved composers of the 19th and 20th centuries. Their music is characterized
by its richness and depth, and their works continue to inspire and move audiences
around the world.
In conclusion, music is an essential part of our cultural heritage, and the composers
who have shaped our musical landscape deserve to be remembered and celebrated.
Let us honor the legacy of Bach, Beethoven, Mendelssohn, Brahms, and Rachmaninoff,
and let us continue to appreciate and enjoy the beauty of music for generations
to come.'
- 'The term "G.O.A.T." (Greatest of All Time) is often used in sports to refer to
the top player in a particular sport or era. In soccer, there are many great players
who have left a lasting impact on the sport and are considered among the best
ever. Here are a few of the top contenders for the title of G.O.A.T. in soccer:
1. Pelé: Pelé is widely considered one of the greatest soccer players of all
time. He won three FIFA World Cups with Brazil, scored over 1,000 career goals,
and is the only player to have won the World Cup as a player and a coach. Pelé
is known for his exceptional technical ability, vision, and goal-scoring prowess,
and is often referred to as the "King of Soccer."
2. Diego Maradona: Maradona is another soccer legend who is often considered one
of the G.O.A.T. candidates. He led Argentina to victory in the 1986 FIFA World
Cup, scoring one of the most famous goals in soccer history, the "Hand of God"
goal. Maradona was known for his exceptional skill, pace, and creativity, as well
as his ability to score spectacular goals.
3. Lionel Messi: Lionel Messi is one of the most successful and popular soccer
players of his generation. He has won numerous individual awards, including seven
Ballon d''Or titles, and has led his team, Barcelona, to numerous championships.
Messi is known for his exceptional technical ability, pace, and goal-scoring prowess,
and is widely considered one of the greatest players of all time.
4. Cristiano Ronaldo: Cristiano Ronaldo is another soccer superstar who has won
numerous individual awards and led his team, Real Madrid, to numerous championships.
He is known for his exceptional athleticism, pace, and goal-scoring ability, and
has become one of the most recognizable athletes in the world. Ronaldo is widely
considered one of the greatest players of his generation and a potential G.O.A.T.
candidate.
5. Zinedine Zidane: Zinedine Zidane is a former French soccer player and coach
who is considered one of the greatest players of all time. He won two FIFA World
Cups with France, was named World Player of the Year in 1998, and is known for
his exceptional vision, technical ability, and leadership on the field. Zidane
is often mentioned as a potential G.O.A.T. candidate.
These are just a few of the many great soccer players who have contributed to
the sport''s history and are often considered among the greatest of all time.
The debate over who the G.O.A.T. in soccer is will likely continue for years to
come.'
- source_sentence: <|summarize|>
sentences:
- 'Here''s a description of each color in the palette for the digital design studio
focused on promo design and thoughtful in-depth research:
1. #0d1925: This is a deep, dark blue color that conveys a sense of trust, stability,
and expertise. It''s a great choice for a design studio that wants to convey a
professional and reliable image.
2. #7391ab: This is a light blue color that suggests a sense of calmness, freshness,
and innovation. It''s a great choice for a design studio that wants to convey
a modern and forward-thinking image.
3. #ffc895: This is a bright, warm yellow color that suggests energy, optimism,
and creativity. It''s a great choice for a design studio that wants to convey
a fun and innovative image.
4. #f6f6f6: This is a light gray color that suggests neutrality, balance, and
sophistication. It''s a great choice for a design studio that wants to convey
a refined and elegant image.
5. #385a76: This is a dark green color that suggests growth, nature, and sustainability.
It''s a great choice for a design studio that wants to convey a environmentally-conscious
and responsible image.
Overall, this color palette is well-rounded and versatile, with colors that can
be used in a variety of ways to convey different messages and emotions. The use
of deep blues, light blues, bright yellows, and neutral grays makes it a great
choice for a design studio that wants to convey a professional, modern, and innovative
image.'
- 'How about "Lingo Love"? This name plays on the idea of using language models
to help with dating, suggesting that the app can help you find your perfect match
by using sophisticated messaging features. The name also has a playful and romantic
feel to it, which is fitting for a dating app.
Here are a few other ideas to consider:
1. "Chatterbox": This name suggests that the app is all about chatting and getting
to know each other, which is a key aspect of dating. It also has a friendly and
approachable feel to it.
2. "Lingo Luck": This name combines the idea of using language models with the
concept of luck in dating, suggesting that the app can help you find love by using
advanced messaging features. It also has a playful and romantic feel to it.
3. "Lingo Love Match": This name combines the idea of using language models with
the concept of finding your perfect match, suggesting that the app can help you
find your ideal partner through sophisticated messaging features. It also has
a romantic and serious feel to it.
4. "ChatterLover": This name combines the idea of chatting and getting to know
each other with the concept of finding love, suggesting that the app can help
you find your perfect match through engaging conversations. It also has a friendly
and romantic feel to it.
5. "Lingo Love Story": This name combines the idea of using language models with
the concept of finding your perfect match and creating a love story, suggesting
that the app can help you find your ideal partner and build a relationship through
sophisticated messaging features. It also has a romantic and serious feel to it.
Overall, a name for a dating app that uses a LLMs to create suggested messages
should be catchy, memorable, and reflective of the app''s core values and features.
It should also have a tone that is appropriate for the target audience and the
type of dating experience the app offers.'
- 'Here are some potential uses cases for large language models (LLMs) in investigative
journalism:
1. Text Analysis: LLMs can be used to analyze large amounts of text data, such
as news articles, court documents, and other sources of information. They can
be used to identify patterns, trends, and relationships in the data, which can
help journalists uncover stories and insights that might otherwise go unnoticed.
2. Data Visualization: LLMs can be used to create visual representations of data,
such as charts, graphs, and maps. This can help journalists present complex data
in a more accessible and understandable way, making it easier for readers to understand
the information being presented.
3. Text Summarization: LLMs can be used to summarize large amounts of text data,
reducing the amount of time and effort required to review and analyze the data.
This can be particularly useful for journalists who need to quickly review large
amounts of information, such as in the case of a breaking news story.
4. Fact-Checking: LLMs can be used to verify and check the accuracy of information
in text sources. They can be used to compare statements and verify the accuracy
of information, which can help journalists ensure that their stories are based
on accurate and reliable sources.
5. Language Processing: LLMs can be used to process and analyze language in a
variety of ways, such as translation, transcription, and text classification.
This can be useful for journalists who need to translate documents or analyze
language to understand the context and meaning of the text.
6. Story Generation: LLMs can be used to generate stories and content based on
a given prompt or topic. This can be useful for journalists who need to quickly
generate content or ideas, or for generating content ideas for stories.
Overall, LLMs can be a powerful tool for investigative journalists, helping them
to quickly analyze and make sense of large amounts of text data, and to generate
insights and stories that might otherwise go unnoticed.'
model-index:
- name: SentenceTransformer based on sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2
results:
- task:
type: semantic-similarity
name: Semantic Similarity
dataset:
name: sts dev
type: sts-dev
metrics:
- type: pearson_cosine
value: -0.17209387421860306
name: Pearson Cosine
- type: spearman_cosine
value: -0.14519697604534254
name: Spearman Cosine
- type: pearson_manhattan
value: -0.18478684918865068
name: Pearson Manhattan
- type: spearman_manhattan
value: -0.22934609512092033
name: Spearman Manhattan
- type: pearson_euclidean
value: -0.24554019485789957
name: Pearson Euclidean
- type: spearman_euclidean
value: -0.2636925680131005
name: Spearman Euclidean
- type: pearson_dot
value: -0.09827403403830653
name: Pearson Dot
- type: spearman_dot
value: -0.07652978034449803
name: Spearman Dot
- type: pearson_max
value: -0.09827403403830653
name: Pearson Max
- type: spearman_max
value: -0.07652978034449803
name: Spearman Max
- type: pearson_cosine
value: -0.5228815388202983
name: Pearson Cosine
- type: spearman_cosine
value: -0.42466509615002906
name: Spearman Cosine
- type: pearson_manhattan
value: 0.041871234564333504
name: Pearson Manhattan
- type: spearman_manhattan
value: 0.01779323694411108
name: Spearman Manhattan
- type: pearson_euclidean
value: -0.02187961676451103
name: Pearson Euclidean
- type: spearman_euclidean
value: -0.034711877576677826
name: Spearman Euclidean
- type: pearson_dot
value: -0.5406291665961442
name: Pearson Dot
- type: spearman_dot
value: -0.42445765589990675
name: Spearman Dot
- type: pearson_max
value: 0.041871234564333504
name: Pearson Max
- type: spearman_max
value: 0.01779323694411108
name: Spearman Max
- type: pearson_cosine
value: -0.868186555898593
name: Pearson Cosine
- type: spearman_cosine
value: -0.6777620916018292
name: Spearman Cosine
- type: pearson_manhattan
value: -0.8512368403264938
name: Pearson Manhattan
- type: spearman_manhattan
value: -0.6299165589119777
name: Spearman Manhattan
- type: pearson_euclidean
value: -0.8487518713213003
name: Pearson Euclidean
- type: spearman_euclidean
value: -0.6237022202033926
name: Spearman Euclidean
- type: pearson_dot
value: -0.8643809390831493
name: Pearson Dot
- type: spearman_dot
value: -0.6508029354917555
name: Spearman Dot
- type: pearson_max
value: -0.8487518713213003
name: Pearson Max
- type: spearman_max
value: -0.6237022202033926
name: Spearman Max
- type: pearson_cosine
value: 0.9544094126053565
name: Pearson Cosine
- type: spearman_cosine
value: 0.9060595979711947
name: Spearman Cosine
- type: pearson_manhattan
value: 0.942315396362075
name: Pearson Manhattan
- type: spearman_manhattan
value: 0.9061702233866991
name: Spearman Manhattan
- type: pearson_euclidean
value: 0.941528689832946
name: Pearson Euclidean
- type: spearman_euclidean
value: 0.9061945563550459
name: Spearman Euclidean
- type: pearson_dot
value: 0.9534770056190236
name: Pearson Dot
- type: spearman_dot
value: 0.9026146734829041
name: Spearman Dot
- type: pearson_max
value: 0.9544094126053565
name: Pearson Max
- type: spearman_max
value: 0.9061945563550459
name: Spearman Max
---
# SentenceTransformer based on sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2](https://huggingface.co/sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2) on the [helpfulness-classification](https://huggingface.co/datasets/jonathanjordan21/helpfulness-classification) dataset. It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2](https://huggingface.co/sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2) <!-- at revision ae06c001a2546bef168b9bf8f570ccb1a16aaa27 -->
- **Maximum Sequence Length:** 128 tokens
- **Output Dimensionality:** 384 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Dataset:**
- [helpfulness-classification](https://huggingface.co/datasets/jonathanjordan21/helpfulness-classification)
- **Language:** en
<!-- - **License:** Unknown -->
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
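The `Pooling` module above mean-pools the token embeddings into a single 384-dimensional sentence vector. As a rough illustration of that step (a minimal sketch using plain `transformers`, with a placeholder input sentence), mean pooling can be reproduced manually:

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Base checkpoint; the fine-tuned weights would be loaded here instead.
tokenizer = AutoTokenizer.from_pretrained("sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2")
model = AutoModel.from_pretrained("sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2")

encoded = tokenizer(["<|summarize|>"], padding=True, truncation=True, max_length=128, return_tensors="pt")
with torch.no_grad():
    token_embeddings = model(**encoded).last_hidden_state  # (batch, seq_len, 384)

# Mean pooling: average the token vectors, ignoring padding positions.
mask = encoded["attention_mask"].unsqueeze(-1).float()
sentence_embedding = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1)
print(sentence_embedding.shape)  # torch.Size([1, 384])
```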
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("sentence_transformers_model_id")
# Run inference
sentences = [
'<|summarize|>',
"Here's a description of each color in the palette for the digital design studio focused on promo design and thoughtful in-depth research:\n\n1. #0d1925: This is a deep, dark blue color that conveys a sense of trust, stability, and expertise. It's a great choice for a design studio that wants to convey a professional and reliable image.\n\n2. #7391ab: This is a light blue color that suggests a sense of calmness, freshness, and innovation. It's a great choice for a design studio that wants to convey a modern and forward-thinking image.\n\n3. #ffc895: This is a bright, warm yellow color that suggests energy, optimism, and creativity. It's a great choice for a design studio that wants to convey a fun and innovative image.\n\n4. #f6f6f6: This is a light gray color that suggests neutrality, balance, and sophistication. It's a great choice for a design studio that wants to convey a refined and elegant image.\n\n5. #385a76: This is a dark green color that suggests growth, nature, and sustainability. It's a great choice for a design studio that wants to convey a environmentally-conscious and responsible image.\n\nOverall, this color palette is well-rounded and versatile, with colors that can be used in a variety of ways to convey different messages and emotions. The use of deep blues, light blues, bright yellows, and neutral grays makes it a great choice for a design studio that wants to convey a professional, modern, and innovative image.",
'How about "Lingo Love"? This name plays on the idea of using language models to help with dating, suggesting that the app can help you find your perfect match by using sophisticated messaging features. The name also has a playful and romantic feel to it, which is fitting for a dating app.\n\nHere are a few other ideas to consider:\n\n1. "Chatterbox": This name suggests that the app is all about chatting and getting to know each other, which is a key aspect of dating. It also has a friendly and approachable feel to it.\n\n2. "Lingo Luck": This name combines the idea of using language models with the concept of luck in dating, suggesting that the app can help you find love by using advanced messaging features. It also has a playful and romantic feel to it.\n\n3. "Lingo Love Match": This name combines the idea of using language models with the concept of finding your perfect match, suggesting that the app can help you find your ideal partner through sophisticated messaging features. It also has a romantic and serious feel to it.\n\n4. "ChatterLover": This name combines the idea of chatting and getting to know each other with the concept of finding love, suggesting that the app can help you find your perfect match through engaging conversations. It also has a friendly and romantic feel to it.\n\n5. "Lingo Love Story": This name combines the idea of using language models with the concept of finding your perfect match and creating a love story, suggesting that the app can help you find your ideal partner and build a relationship through sophisticated messaging features. It also has a romantic and serious feel to it.\n\nOverall, a name for a dating app that uses a LLMs to create suggested messages should be catchy, memorable, and reflective of the app\'s core values and features. It should also have a tone that is appropriate for the target audience and the type of dating experience the app offers.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 384]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
<!--
### Direct Usage (Transformers)
<details><summary>Click to see the direct usage in Transformers</summary>
</details>
-->
<!--
### Downstream Usage (Sentence Transformers)
You can finetune this model on your own dataset.
<details><summary>Click to expand</summary>
</details>
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
## Evaluation
### Metrics
#### Semantic Similarity
* Dataset: `sts-dev`
* Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
| Metric | Value |
|:--------------------|:------------|
| pearson_cosine | -0.1721 |
| **spearman_cosine** | **-0.1452** |
| pearson_manhattan | -0.1848 |
| spearman_manhattan | -0.2293 |
| pearson_euclidean | -0.2455 |
| spearman_euclidean | -0.2637 |
| pearson_dot | -0.0983 |
| spearman_dot | -0.0765 |
| pearson_max | -0.0983 |
| spearman_max | -0.0765 |
#### Semantic Similarity
* Dataset: `sts-dev`
* Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
| Metric | Value |
|:--------------------|:------------|
| pearson_cosine | -0.5229 |
| **spearman_cosine** | **-0.4247** |
| pearson_manhattan | 0.0419 |
| spearman_manhattan | 0.0178 |
| pearson_euclidean | -0.0219 |
| spearman_euclidean | -0.0347 |
| pearson_dot | -0.5406 |
| spearman_dot | -0.4245 |
| pearson_max | 0.0419 |
| spearman_max | 0.0178 |
#### Semantic Similarity
* Dataset: `sts-dev`
* Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
| Metric | Value |
|:--------------------|:------------|
| pearson_cosine | -0.8682 |
| **spearman_cosine** | **-0.6778** |
| pearson_manhattan | -0.8512 |
| spearman_manhattan | -0.6299 |
| pearson_euclidean | -0.8488 |
| spearman_euclidean | -0.6237 |
| pearson_dot | -0.8644 |
| spearman_dot | -0.6508 |
| pearson_max | -0.8488 |
| spearman_max | -0.6237 |
#### Semantic Similarity
* Dataset: `sts-dev`
* Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| pearson_cosine | 0.9544 |
| **spearman_cosine** | **0.9061** |
| pearson_manhattan | 0.9423 |
| spearman_manhattan | 0.9062 |
| pearson_euclidean | 0.9415 |
| spearman_euclidean | 0.9062 |
| pearson_dot | 0.9535 |
| spearman_dot | 0.9026 |
| pearson_max | 0.9544 |
| spearman_max | 0.9062 |
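The `sts-dev` tables above are produced by the `EmbeddingSimilarityEvaluator`. A minimal sketch of how such an evaluation can be run is shown below; the sentence pairs and gold scores are toy placeholders, not the actual dev split:

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator, SimilarityFunction

model = SentenceTransformer("sentence_transformers_model_id")  # placeholder id

# Toy dev pairs; the real evaluation uses held-out (sentence1, sentence2, score) triples.
evaluator = EmbeddingSimilarityEvaluator(
    sentences1=["<|summarize|>", "<|summarize|>"],
    sentences2=["A short candidate response.", "Another candidate response."],
    scores=[1.0, 0.5],
    main_similarity=SimilarityFunction.COSINE,
    name="sts-dev",
)
results = evaluator(model)  # dict of Pearson/Spearman metrics keyed by "sts-dev_..."
print(results)
```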
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Dataset
#### helpfulness-classification
* Dataset: [helpfulness-classification](https://huggingface.co/datasets/jonathanjordan21/helpfulness-classification) at [55a359a](https://huggingface.co/datasets/jonathanjordan21/helpfulness-classification/tree/55a359a62a79b5087145fb1fbcb1a4f6fa9d1001)
* Size: 21,362 training samples
* Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>score</code>
* Approximate statistics based on the first 1000 samples:
| | sentence1 | sentence2 | score |
|:--------|:-----------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:---------------------------------------------------------------|
| type | string | string | float |
| details | <ul><li>min: 4 tokens</li><li>mean: 44.54 tokens</li><li>max: 128 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 119.41 tokens</li><li>max: 128 tokens</li></ul> | <ul><li>min: 0.0</li><li>mean: 0.72</li><li>max: 1.0</li></ul> |
* Samples:
| sentence1 | sentence2 | score |
|:-------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:------------------|
| <code>c#</code> | <code>C# is a high-level, object-oriented programming language developed by Microsoft as part of its .NET initiative. It was created as a modern alternative to Java and supports a variety of programming paradigms, including imperative, functional, and event-driven. C# is primarily used for Windows application development, but it can also be used for web, mobile, and game development. The language is designed to be safe, secure, and efficient, and it provides developers with a rich set of libraries and tools for building robust and scalable applications. C# is also widely used in the game development industry, particularly in the development of games for the Xbox 360 and Xbox One consoles.</code> | <code>0.75</code> |
| <code>c#</code> | <code>C# (pronounced "C sharp") is a modern, object-oriented programming language developed by Microsoft. It is widely used for building various types of applications, including web applications, desktop applications, mobile applications, and games. C# is similar to other programming languages such as Java and C++, and it is known for its simplicity and ease of use. C# is a powerful language that provides a rich set of libraries and frameworks that make it easy to build robust and scalable applications.<br><br>Here is a brief overview of some key features of C#:<br><br>1. Object-oriented: C# is an object-oriented language, which means it uses the concept of objects to represent real-world entities and their behavior.<br><br>2. Cross-platform: C# can be used to build applications for multiple platforms, including Windows, macOS, and Linux.<br><br>3. Strongly typed: C# is a strongly typed language, which means that variables must be declared with a specific type, and their type cannot be changed at runtime.<br><br>4. Event-driven: C# uses an event-driven programming model, which means that programs are built around the concept of events, such as user input or network activity.<br><br>5. Garbage-collected: C# has a garbage collector that automatically manages memory allocation and deallocation, making it easier to write memory-efficient and robust applications.<br><br>6. Community-driven: C# has a large and active community of developers, who contribute to the language and its libraries through open-source projects and other initiatives.<br><br>Overall, C# is a versatile and powerful programming language that is widely used for building a variety of applications.</code> | <code>1.0</code> |
| <code>bacillus subtilus</code> | <code>Bacillus subtilis is a Gram-positive, rod-shaped bacterium commonly found in soil and water. It is a facultative anaerobe that can grow in the presence or absence of oxygen. B. subtilis is a model organism for studying bacterial cell biology, genetics, and physiology, and has many industrial and medical applications.<br><br>B. subtilis is a versatile organism with many interesting properties. It can form spores that allow it to survive harsh environmental conditions, such as heat, cold, and dehydration. The spores can remain dormant for long periods of time and then germinate when conditions are favorable. B. subtilis can also produce a variety of enzymes, antibiotics, and other compounds that have industrial and medical applications.<br><br>B. subtilis has been used in the production of enzymes for industrial applications, such as in the food and beverage industry for the production of cheese, yogurt, and beer. It is also used in the production of antibiotics and other pharmaceutical compounds. B. subtilis is also a model organism for studying bacterial cell biology, genetics, and physiology, and has been used extensively in research to study the mechanisms of bacterial growth, development, and survival.<br><br>Overall, B. subtilis is a versatile and important organism with many industrial and medical applications. Its properties and behaviors make it a valuable tool for researchers and industry alike.</code> | <code>1.0</code> |
* Loss: [<code>CosineSimilarityLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosinesimilarityloss) with these parameters:
```json
{
"loss_fct": "torch.nn.modules.loss.MSELoss"
}
```
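For reference, a minimal training sketch consistent with this loss and the non-default hyperparameters listed below might look like the following; the output directory is an assumption, and the dataset is expected to expose the `(sentence1, sentence2, score)` columns described above:

```python
from datasets import load_dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import CosineSimilarityLoss

model = SentenceTransformer("sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2")

# CosineSimilarityLoss consumes (sentence1, sentence2) pairs with a float score label.
train_dataset = load_dataset("jonathanjordan21/helpfulness-classification", split="train")
loss = CosineSimilarityLoss(model)  # defaults to an MSE loss on the cosine scores

args = SentenceTransformerTrainingArguments(
    output_dir="outputs",  # assumption; not stated in this card
    num_train_epochs=3,
    per_device_train_batch_size=64,
    warmup_ratio=0.1,
    fp16=True,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    loss=loss,
)
trainer.train()
```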
### Training Hyperparameters
#### Non-Default Hyperparameters
- `per_device_train_batch_size`: 64
- `warmup_ratio`: 0.1
- `fp16`: True
#### All Hyperparameters
<details><summary>Click to expand</summary>
- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: no
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 64
- `per_device_eval_batch_size`: 8
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 5e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 3
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `batch_sampler`: batch_sampler
- `multi_dataset_batch_sampler`: proportional
</details>
### Training Logs
<details><summary>Click to expand</summary>
| Epoch | Step | Training Loss | sts-dev_spearman_cosine |
|:------:|:----:|:-------------:|:-----------------------:|
| 0.0749 | 50 | 4.9311 | - |
| 0.1497 | 100 | 4.8825 | - |
| 0.2246 | 150 | 4.7368 | - |
| 0.2994 | 200 | 4.519 | - |
| 0.3743 | 250 | 4.3786 | - |
| 0.4491 | 300 | 4.3008 | - |
| 0.5240 | 350 | 4.2746 | - |
| 0.5988 | 400 | 4.2331 | - |
| 0.6737 | 450 | 4.2043 | - |
| 0.7485 | 500 | 4.324 | - |
| 0.8234 | 550 | 4.5276 | - |
| 0.8982 | 600 | 4.379 | - |
| 0.0749 | 50 | 1.4284 | - |
| 0.1497 | 100 | 1.3783 | - |
| 0.2246 | 150 | 1.3934 | - |
| 0.2994 | 200 | 1.3786 | - |
| 0.3743 | 250 | 1.4103 | - |
| 0.4491 | 300 | 1.3666 | - |
| 0.5240 | 350 | 1.3735 | - |
| 0.5988 | 400 | 1.3667 | - |
| 0.6737 | 450 | 1.3393 | - |
| 0.7485 | 500 | 1.3432 | - |
| 0.8234 | 550 | 1.3696 | - |
| 0.8982 | 600 | 1.3582 | - |
| 0.9731 | 650 | 1.3573 | - |
| 1.0479 | 700 | 1.3204 | - |
| 1.1228 | 750 | 1.3347 | - |
| 1.1976 | 800 | 1.3104 | - |
| 1.2725 | 850 | 1.3162 | - |
| 1.3473 | 900 | 1.2872 | - |
| 1.4222 | 950 | 1.2728 | - |
| 1.4970 | 1000 | 1.3025 | - |
| 1.5719 | 1050 | 1.2827 | - |
| 1.6467 | 1100 | 1.3142 | - |
| 1.7216 | 1150 | 1.2892 | - |
| 1.7964 | 1200 | 1.2861 | - |
| 1.8713 | 1250 | 1.2743 | - |
| 1.9461 | 1300 | 1.2918 | - |
| 2.0210 | 1350 | 1.2937 | - |
| 2.0958 | 1400 | 1.1952 | - |
| 2.1707 | 1450 | 1.1722 | - |
| 2.2455 | 1500 | 1.2149 | - |
| 2.3204 | 1550 | 1.2037 | - |
| 2.3952 | 1600 | 1.1624 | - |
| 2.4701 | 1650 | 1.1731 | - |
| 2.5449 | 1700 | 1.1903 | - |
| 2.6198 | 1750 | 1.1569 | - |
| 2.6946 | 1800 | 1.164 | - |
| 2.7695 | 1850 | 1.1744 | - |
| 2.8443 | 1900 | 1.1595 | - |
| 2.9192 | 1950 | 1.1505 | - |
| 2.9940 | 2000 | 1.1174 | - |
| 3.0 | 2004 | - | -0.1452 |
| 0.0749 | 50 | 1.1597 | - |
| 0.1497 | 100 | 1.1321 | - |
| 0.2246 | 150 | 1.176 | - |
| 0.2994 | 200 | 1.1641 | - |
| 0.3743 | 250 | 1.1781 | - |
| 0.4491 | 300 | 1.1613 | - |
| 0.5240 | 350 | 1.1229 | - |
| 0.5988 | 400 | 1.1224 | - |
| 0.6737 | 450 | 1.1707 | - |
| 0.7485 | 500 | 1.1398 | - |
| 0.8234 | 550 | 1.1484 | - |
| 0.8982 | 600 | 1.1734 | - |
| 0.9731 | 650 | 1.1669 | - |
| 1.0479 | 700 | 1.0559 | - |
| 1.1228 | 750 | 1.0126 | - |
| 1.1976 | 800 | 0.9651 | - |
| 1.2725 | 850 | 0.9848 | - |
| 1.3473 | 900 | 0.9897 | - |
| 1.4222 | 950 | 0.9773 | - |
| 1.4970 | 1000 | 0.9908 | - |
| 1.5719 | 1050 | 0.9583 | - |
| 1.6467 | 1100 | 0.9986 | - |
| 1.7216 | 1150 | 0.9903 | - |
| 1.7964 | 1200 | 0.9897 | - |
| 1.8713 | 1250 | 0.9681 | - |
| 1.9461 | 1300 | 0.9832 | - |
| 2.0210 | 1350 | 0.9494 | - |
| 2.0958 | 1400 | 0.7348 | - |
| 2.1707 | 1450 | 0.7182 | - |
| 2.2455 | 1500 | 0.739 | - |
| 2.3204 | 1550 | 0.7585 | - |
| 2.3952 | 1600 | 0.726 | - |
| 2.4701 | 1650 | 0.7705 | - |
| 2.5449 | 1700 | 0.776 | - |
| 2.6198 | 1750 | 0.7305 | - |
| 2.6946 | 1800 | 0.7412 | - |
| 2.7695 | 1850 | 0.7758 | - |
| 2.8443 | 1900 | 0.7659 | - |
| 2.9192 | 1950 | 0.7273 | - |
| 2.9940 | 2000 | 0.7207 | - |
| 3.0 | 2004 | - | -0.4247 |
| 0.2994 | 50 | 1.3345 | - |
| 0.5988 | 100 | 0.9648 | - |
| 0.8982 | 150 | 0.8681 | - |
| 1.1976 | 200 | 0.7723 | - |
| 1.4970 | 250 | 0.7426 | - |
| 1.7964 | 300 | 0.7333 | - |
| 2.0958 | 350 | 0.6736 | - |
| 2.3952 | 400 | 0.5491 | - |
| 2.6946 | 450 | 0.5857 | - |
| 2.9940 | 500 | 0.6135 | - |
| 3.0 | 501 | - | -0.6778 |
| 0.2994 | 50 | 0.3463 | - |
| 0.5988 | 100 | 0.03 | - |
| 0.8982 | 150 | 0.0216 | - |
| 1.1976 | 200 | 0.0168 | - |
| 1.4970 | 250 | 0.0157 | - |
| 1.7964 | 300 | 0.017 | - |
| 2.0958 | 350 | 0.0156 | - |
| 2.3952 | 400 | 0.0108 | - |
| 2.6946 | 450 | 0.0136 | - |
| 2.9940 | 500 | 0.0149 | - |
| 3.0 | 501 | - | 0.9061 |
| 0.2994 | 50 | 0.0966 | - |
| 0.5988 | 100 | 0.036 | - |
| 0.8982 | 150 | 0.0263 | - |
| 1.1976 | 200 | 0.02 | - |
| 1.4970 | 250 | 0.0163 | - |
| 1.7964 | 300 | 0.0173 | - |
| 2.0958 | 350 | 0.0149 | - |
| 2.3952 | 400 | 0.0111 | - |
| 2.6946 | 450 | 0.013 | - |
| 2.9940 | 500 | 0.015 | - |
</details>
### Framework Versions
- Python: 3.10.14
- Sentence Transformers: 3.2.1
- Transformers: 4.45.1
- PyTorch: 2.4.0
- Accelerate: 0.34.2
- Datasets: 3.0.1
- Tokenizers: 0.20.0
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
--> | ["TEXT_CLASSIFICATION", "SEMANTIC_SIMILARITY", "TRANSLATION", "SUMMARIZATION"] | ["CRAFT"] | Non_BioNLP |
# SentenceTransformer based on sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2](https://huggingface.co/sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2) on the [helpfulness-classification](https://huggingface.co/datasets/jonathanjordan21/helpfulness-classification) dataset. It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2](https://huggingface.co/sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2) <!-- at revision ae06c001a2546bef168b9bf8f570ccb1a16aaa27 -->
- **Maximum Sequence Length:** 128 tokens
- **Output Dimensionality:** 384 tokens
- **Similarity Function:** Cosine Similarity
- **Training Dataset:**
- [helpfulness-classification](https://huggingface.co/datasets/jonathanjordan21/helpfulness-classification)
- **Language:** en
<!-- - **License:** Unknown -->
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("sentence_transformers_model_id")
# Run inference
sentences = [
'<|summarize|>',
"Here's a description of each color in the palette for the digital design studio focused on promo design and thoughtful in-depth research:\n\n1. #0d1925: This is a deep, dark blue color that conveys a sense of trust, stability, and expertise. It's a great choice for a design studio that wants to convey a professional and reliable image.\n\n2. #7391ab: This is a light blue color that suggests a sense of calmness, freshness, and innovation. It's a great choice for a design studio that wants to convey a modern and forward-thinking image.\n\n3. #ffc895: This is a bright, warm yellow color that suggests energy, optimism, and creativity. It's a great choice for a design studio that wants to convey a fun and innovative image.\n\n4. #f6f6f6: This is a light gray color that suggests neutrality, balance, and sophistication. It's a great choice for a design studio that wants to convey a refined and elegant image.\n\n5. #385a76: This is a dark green color that suggests growth, nature, and sustainability. It's a great choice for a design studio that wants to convey a environmentally-conscious and responsible image.\n\nOverall, this color palette is well-rounded and versatile, with colors that can be used in a variety of ways to convey different messages and emotions. The use of deep blues, light blues, bright yellows, and neutral grays makes it a great choice for a design studio that wants to convey a professional, modern, and innovative image.",
'How about "Lingo Love"? This name plays on the idea of using language models to help with dating, suggesting that the app can help you find your perfect match by using sophisticated messaging features. The name also has a playful and romantic feel to it, which is fitting for a dating app.\n\nHere are a few other ideas to consider:\n\n1. "Chatterbox": This name suggests that the app is all about chatting and getting to know each other, which is a key aspect of dating. It also has a friendly and approachable feel to it.\n\n2. "Lingo Luck": This name combines the idea of using language models with the concept of luck in dating, suggesting that the app can help you find love by using advanced messaging features. It also has a playful and romantic feel to it.\n\n3. "Lingo Love Match": This name combines the idea of using language models with the concept of finding your perfect match, suggesting that the app can help you find your ideal partner through sophisticated messaging features. It also has a romantic and serious feel to it.\n\n4. "ChatterLover": This name combines the idea of chatting and getting to know each other with the concept of finding love, suggesting that the app can help you find your perfect match through engaging conversations. It also has a friendly and romantic feel to it.\n\n5. "Lingo Love Story": This name combines the idea of using language models with the concept of finding your perfect match and creating a love story, suggesting that the app can help you find your ideal partner and build a relationship through sophisticated messaging features. It also has a romantic and serious feel to it.\n\nOverall, a name for a dating app that uses a LLMs to create suggested messages should be catchy, memorable, and reflective of the app\'s core values and features. It should also have a tone that is appropriate for the target audience and the type of dating experience the app offers.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 384]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
<!--
### Direct Usage (Transformers)
<details><summary>Click to see the direct usage in Transformers</summary>
</details>
-->
<!--
### Downstream Usage (Sentence Transformers)
You can finetune this model on your own dataset.
<details><summary>Click to expand</summary>
</details>
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
## Evaluation
### Metrics
#### Semantic Similarity
* Dataset: `sts-dev`
* Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
| Metric | Value |
|:--------------------|:------------|
| pearson_cosine | -0.1721 |
| **spearman_cosine** | **-0.1452** |
| pearson_manhattan | -0.1848 |
| spearman_manhattan | -0.2293 |
| pearson_euclidean | -0.2455 |
| spearman_euclidean | -0.2637 |
| pearson_dot | -0.0983 |
| spearman_dot | -0.0765 |
| pearson_max | -0.0983 |
| spearman_max | -0.0765 |
#### Semantic Similarity
* Dataset: `sts-dev`
* Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
| Metric | Value |
|:--------------------|:------------|
| pearson_cosine | -0.5229 |
| **spearman_cosine** | **-0.4247** |
| pearson_manhattan | 0.0419 |
| spearman_manhattan | 0.0178 |
| pearson_euclidean | -0.0219 |
| spearman_euclidean | -0.0347 |
| pearson_dot | -0.5406 |
| spearman_dot | -0.4245 |
| pearson_max | 0.0419 |
| spearman_max | 0.0178 |
#### Semantic Similarity
* Dataset: `sts-dev`
* Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
| Metric | Value |
|:--------------------|:------------|
| pearson_cosine | -0.8682 |
| **spearman_cosine** | **-0.6778** |
| pearson_manhattan | -0.8512 |
| spearman_manhattan | -0.6299 |
| pearson_euclidean | -0.8488 |
| spearman_euclidean | -0.6237 |
| pearson_dot | -0.8644 |
| spearman_dot | -0.6508 |
| pearson_max | -0.8488 |
| spearman_max | -0.6237 |
#### Semantic Similarity
* Dataset: `sts-dev`
* Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| pearson_cosine | 0.9544 |
| **spearman_cosine** | **0.9061** |
| pearson_manhattan | 0.9423 |
| spearman_manhattan | 0.9062 |
| pearson_euclidean | 0.9415 |
| spearman_euclidean | 0.9062 |
| pearson_dot | 0.9535 |
| spearman_dot | 0.9026 |
| pearson_max | 0.9544 |
| spearman_max | 0.9062 |
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Dataset
#### helpfulness-classification
* Dataset: [helpfulness-classification](https://huggingface.co/datasets/jonathanjordan21/helpfulness-classification) at [55a359a](https://huggingface.co/datasets/jonathanjordan21/helpfulness-classification/tree/55a359a62a79b5087145fb1fbcb1a4f6fa9d1001)
* Size: 21,362 training samples
* Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>score</code>
* Approximate statistics based on the first 1000 samples:
| | sentence1 | sentence2 | score |
|:--------|:-----------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:---------------------------------------------------------------|
| type | string | string | float |
| details | <ul><li>min: 4 tokens</li><li>mean: 44.54 tokens</li><li>max: 128 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 119.41 tokens</li><li>max: 128 tokens</li></ul> | <ul><li>min: 0.0</li><li>mean: 0.72</li><li>max: 1.0</li></ul> |
* Samples:
| sentence1 | sentence2 | score |
|:-------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:------------------|
| <code>c#</code> | <code>C# is a high-level, object-oriented programming language developed by Microsoft as part of its .NET initiative. It was created as a modern alternative to Java and supports a variety of programming paradigms, including imperative, functional, and event-driven. C# is primarily used for Windows application development, but it can also be used for web, mobile, and game development. The language is designed to be safe, secure, and efficient, and it provides developers with a rich set of libraries and tools for building robust and scalable applications. C# is also widely used in the game development industry, particularly in the development of games for the Xbox 360 and Xbox One consoles.</code> | <code>0.75</code> |
| <code>c#</code> | <code>C# (pronounced "C sharp") is a modern, object-oriented programming language developed by Microsoft. It is widely used for building various types of applications, including web applications, desktop applications, mobile applications, and games. C# is similar to other programming languages such as Java and C++, and it is known for its simplicity and ease of use. C# is a powerful language that provides a rich set of libraries and frameworks that make it easy to build robust and scalable applications.<br><br>Here is a brief overview of some key features of C#:<br><br>1. Object-oriented: C# is an object-oriented language, which means it uses the concept of objects to represent real-world entities and their behavior.<br><br>2. Cross-platform: C# can be used to build applications for multiple platforms, including Windows, macOS, and Linux.<br><br>3. Strongly typed: C# is a strongly typed language, which means that variables must be declared with a specific type, and their type cannot be changed at runtime.<br><br>4. Event-driven: C# uses an event-driven programming model, which means that programs are built around the concept of events, such as user input or network activity.<br><br>5. Garbage-collected: C# has a garbage collector that automatically manages memory allocation and deallocation, making it easier to write memory-efficient and robust applications.<br><br>6. Community-driven: C# has a large and active community of developers, who contribute to the language and its libraries through open-source projects and other initiatives.<br><br>Overall, C# is a versatile and powerful programming language that is widely used for building a variety of applications.</code> | <code>1.0</code> |
| <code>bacillus subtilus</code> | <code>Bacillus subtilis is a Gram-positive, rod-shaped bacterium commonly found in soil and water. It is a facultative anaerobe that can grow in the presence or absence of oxygen. B. subtilis is a model organism for studying bacterial cell biology, genetics, and physiology, and has many industrial and medical applications.<br><br>B. subtilis is a versatile organism with many interesting properties. It can form spores that allow it to survive harsh environmental conditions, such as heat, cold, and dehydration. The spores can remain dormant for long periods of time and then germinate when conditions are favorable. B. subtilis can also produce a variety of enzymes, antibiotics, and other compounds that have industrial and medical applications.<br><br>B. subtilis has been used in the production of enzymes for industrial applications, such as in the food and beverage industry for the production of cheese, yogurt, and beer. It is also used in the production of antibiotics and other pharmaceutical compounds. B. subtilis is also a model organism for studying bacterial cell biology, genetics, and physiology, and has been used extensively in research to study the mechanisms of bacterial growth, development, and survival.<br><br>Overall, B. subtilis is a versatile and important organism with many industrial and medical applications. Its properties and behaviors make it a valuable tool for researchers and industry alike.</code> | <code>1.0</code> |
* Loss: [<code>CosineSimilarityLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosinesimilarityloss) with these parameters:
```json
{
"loss_fct": "torch.nn.modules.loss.MSELoss"
}
```
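
For orientation, here is a minimal sketch of how this loss is typically wired up in sentence-transformers v3. The base model name is taken from this card's metadata; the dataset contents and column names are illustrative assumptions based on the sample rows above, not the original training script:

```python
# Minimal sketch (illustrative, not the original training code).
import torch
from datasets import Dataset
from sentence_transformers import SentenceTransformer, losses

model = SentenceTransformer("sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2")

# Two text columns plus a float similarity score, mirroring the sample rows above
# (long texts truncated here for readability).
train_dataset = Dataset.from_dict({
    "sentence1": ["c#", "bacillus subtilus"],
    "sentence2": [
        "C# is a high-level, object-oriented programming language ...",
        "Bacillus subtilis is a Gram-positive, rod-shaped bacterium ...",
    ],
    "score": [0.75, 1.0],
})

# CosineSimilarityLoss regresses cosine(embed(sentence1), embed(sentence2))
# onto `score`; `loss_fct` is the torch.nn MSELoss named in the JSON above.
loss = losses.CosineSimilarityLoss(model, loss_fct=torch.nn.MSELoss())
```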
### Training Hyperparameters
#### Non-Default Hyperparameters
- `per_device_train_batch_size`: 64
- `warmup_ratio`: 0.1
- `fp16`: True
#### All Hyperparameters
<details><summary>Click to expand</summary>
- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: no
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 64
- `per_device_eval_batch_size`: 8
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 5e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 3
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `batch_sampler`: batch_sampler
- `multi_dataset_batch_sampler`: proportional
</details>
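
As a convenience, the non-default values above can be expressed as v3 `SentenceTransformerTrainingArguments`. This is a reconstruction from the list, not the original script; `output_dir` is a placeholder since the card does not specify one:

```python
# Reconstruction of the listed hyperparameters; `output_dir` is a placeholder.
from sentence_transformers import SentenceTransformerTrainingArguments

args = SentenceTransformerTrainingArguments(
    output_dir="output",              # not specified in the card
    per_device_train_batch_size=64,   # non-default, see above
    warmup_ratio=0.1,                 # non-default, see above
    fp16=True,                        # non-default, see above
    num_train_epochs=3,               # from the full list
    learning_rate=5e-5,               # default value, shown for completeness
    seed=42,                          # from the full list
)
```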
### Training Logs
<details><summary>Click to expand</summary>
| Epoch | Step | Training Loss | sts-dev_spearman_cosine |
|:------:|:----:|:-------------:|:-----------------------:|
| 0.0749 | 50 | 4.9311 | - |
| 0.1497 | 100 | 4.8825 | - |
| 0.2246 | 150 | 4.7368 | - |
| 0.2994 | 200 | 4.519 | - |
| 0.3743 | 250 | 4.3786 | - |
| 0.4491 | 300 | 4.3008 | - |
| 0.5240 | 350 | 4.2746 | - |
| 0.5988 | 400 | 4.2331 | - |
| 0.6737 | 450 | 4.2043 | - |
| 0.7485 | 500 | 4.324 | - |
| 0.8234 | 550 | 4.5276 | - |
| 0.8982 | 600 | 4.379 | - |
| 0.0749 | 50 | 1.4284 | - |
| 0.1497 | 100 | 1.3783 | - |
| 0.2246 | 150 | 1.3934 | - |
| 0.2994 | 200 | 1.3786 | - |
| 0.3743 | 250 | 1.4103 | - |
| 0.4491 | 300 | 1.3666 | - |
| 0.5240 | 350 | 1.3735 | - |
| 0.5988 | 400 | 1.3667 | - |
| 0.6737 | 450 | 1.3393 | - |
| 0.7485 | 500 | 1.3432 | - |
| 0.8234 | 550 | 1.3696 | - |
| 0.8982 | 600 | 1.3582 | - |
| 0.9731 | 650 | 1.3573 | - |
| 1.0479 | 700 | 1.3204 | - |
| 1.1228 | 750 | 1.3347 | - |
| 1.1976 | 800 | 1.3104 | - |
| 1.2725 | 850 | 1.3162 | - |
| 1.3473 | 900 | 1.2872 | - |
| 1.4222 | 950 | 1.2728 | - |
| 1.4970 | 1000 | 1.3025 | - |
| 1.5719 | 1050 | 1.2827 | - |
| 1.6467 | 1100 | 1.3142 | - |
| 1.7216 | 1150 | 1.2892 | - |
| 1.7964 | 1200 | 1.2861 | - |
| 1.8713 | 1250 | 1.2743 | - |
| 1.9461 | 1300 | 1.2918 | - |
| 2.0210 | 1350 | 1.2937 | - |
| 2.0958 | 1400 | 1.1952 | - |
| 2.1707 | 1450 | 1.1722 | - |
| 2.2455 | 1500 | 1.2149 | - |
| 2.3204 | 1550 | 1.2037 | - |
| 2.3952 | 1600 | 1.1624 | - |
| 2.4701 | 1650 | 1.1731 | - |
| 2.5449 | 1700 | 1.1903 | - |
| 2.6198 | 1750 | 1.1569 | - |
| 2.6946 | 1800 | 1.164 | - |
| 2.7695 | 1850 | 1.1744 | - |
| 2.8443 | 1900 | 1.1595 | - |
| 2.9192 | 1950 | 1.1505 | - |
| 2.9940 | 2000 | 1.1174 | - |
| 3.0 | 2004 | - | -0.1452 |
| 0.0749 | 50 | 1.1597 | - |
| 0.1497 | 100 | 1.1321 | - |
| 0.2246 | 150 | 1.176 | - |
| 0.2994 | 200 | 1.1641 | - |
| 0.3743 | 250 | 1.1781 | - |
| 0.4491 | 300 | 1.1613 | - |
| 0.5240 | 350 | 1.1229 | - |
| 0.5988 | 400 | 1.1224 | - |
| 0.6737 | 450 | 1.1707 | - |
| 0.7485 | 500 | 1.1398 | - |
| 0.8234 | 550 | 1.1484 | - |
| 0.8982 | 600 | 1.1734 | - |
| 0.9731 | 650 | 1.1669 | - |
| 1.0479 | 700 | 1.0559 | - |
| 1.1228 | 750 | 1.0126 | - |
| 1.1976 | 800 | 0.9651 | - |
| 1.2725 | 850 | 0.9848 | - |
| 1.3473 | 900 | 0.9897 | - |
| 1.4222 | 950 | 0.9773 | - |
| 1.4970 | 1000 | 0.9908 | - |
| 1.5719 | 1050 | 0.9583 | - |
| 1.6467 | 1100 | 0.9986 | - |
| 1.7216 | 1150 | 0.9903 | - |
| 1.7964 | 1200 | 0.9897 | - |
| 1.8713 | 1250 | 0.9681 | - |
| 1.9461 | 1300 | 0.9832 | - |
| 2.0210 | 1350 | 0.9494 | - |
| 2.0958 | 1400 | 0.7348 | - |
| 2.1707 | 1450 | 0.7182 | - |
| 2.2455 | 1500 | 0.739 | - |
| 2.3204 | 1550 | 0.7585 | - |
| 2.3952 | 1600 | 0.726 | - |
| 2.4701 | 1650 | 0.7705 | - |
| 2.5449 | 1700 | 0.776 | - |
| 2.6198 | 1750 | 0.7305 | - |
| 2.6946 | 1800 | 0.7412 | - |
| 2.7695 | 1850 | 0.7758 | - |
| 2.8443 | 1900 | 0.7659 | - |
| 2.9192 | 1950 | 0.7273 | - |
| 2.9940 | 2000 | 0.7207 | - |
| 3.0 | 2004 | - | -0.4247 |
| 0.2994 | 50 | 1.3345 | - |
| 0.5988 | 100 | 0.9648 | - |
| 0.8982 | 150 | 0.8681 | - |
| 1.1976 | 200 | 0.7723 | - |
| 1.4970 | 250 | 0.7426 | - |
| 1.7964 | 300 | 0.7333 | - |
| 2.0958 | 350 | 0.6736 | - |
| 2.3952 | 400 | 0.5491 | - |
| 2.6946 | 450 | 0.5857 | - |
| 2.9940 | 500 | 0.6135 | - |
| 3.0 | 501 | - | -0.6778 |
| 0.2994 | 50 | 0.3463 | - |
| 0.5988 | 100 | 0.03 | - |
| 0.8982 | 150 | 0.0216 | - |
| 1.1976 | 200 | 0.0168 | - |
| 1.4970 | 250 | 0.0157 | - |
| 1.7964 | 300 | 0.017 | - |
| 2.0958 | 350 | 0.0156 | - |
| 2.3952 | 400 | 0.0108 | - |
| 2.6946 | 450 | 0.0136 | - |
| 2.9940 | 500 | 0.0149 | - |
| 3.0 | 501 | - | 0.9061 |
| 0.2994 | 50 | 0.0966 | - |
| 0.5988 | 100 | 0.036 | - |
| 0.8982 | 150 | 0.0263 | - |
| 1.1976 | 200 | 0.02 | - |
| 1.4970 | 250 | 0.0163 | - |
| 1.7964 | 300 | 0.0173 | - |
| 2.0958 | 350 | 0.0149 | - |
| 2.3952 | 400 | 0.0111 | - |
| 2.6946 | 450 | 0.013 | - |
| 2.9940 | 500 | 0.015 | - |
</details>
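
The `sts-dev_spearman_cosine` column in the logs above is produced by an embedding-similarity evaluator named `sts-dev`, which prefixes its metric keys with that name. A sketch of how such an evaluator is constructed (the sentence pairs and gold scores here are illustrative placeholders, not the actual dev set):

```python
# Illustrative sketch: where a metric key like "sts-dev_spearman_cosine" comes from.
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator

model = SentenceTransformer("sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2")

dev_evaluator = EmbeddingSimilarityEvaluator(
    sentences1=["first anchor text", "second anchor text"],   # placeholder pairs
    sentences2=["first candidate text", "second candidate text"],
    scores=[0.75, 1.0],                                       # gold similarities in [0, 1]
    name="sts-dev",                                           # prefixes the metric keys
)
metrics = dev_evaluator(model)  # returns a dict of Pearson/Spearman metrics
print(metrics["sts-dev_spearman_cosine"])
```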
### Framework Versions
- Python: 3.10.14
- Sentence Transformers: 3.2.1
- Transformers: 4.45.1
- PyTorch: 2.4.0
- Accelerate: 0.34.2
- Datasets: 3.0.1
- Tokenizers: 0.20.0
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
--> | {"base_model": "sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2", "datasets": ["jonathanjordan21/helpfulness-classification"], "language": ["en"], "library_name": "sentence-transformers", "metrics": ["pearson_cosine", "spearman_cosine", "pearson_manhattan", "spearman_manhattan", "pearson_euclidean", "spearman_euclidean", "pearson_dot", "spearman_dot", "pearson_max", "spearman_max"], "pipeline_tag": "sentence-similarity", "tags": ["sentence-transformers", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:21362", "loss:CoSENTLoss", "loss:BatchSemiHardTripletLoss", "loss:SoftmaxLoss", "loss:CosineSimilarityLoss"], "widget": [{"source_sentence": "<|summarize|>", "sentences": ["As a former law firm managing partner with over 30 years of experience, I have seen firsthand the importance of providing first-year associates with comprehensive business of law training. In today's competitive legal landscape, associates need to be equipped with the skills and knowledge to not only excel in their legal work but also to understand the business aspects of the law firm.\n\nOne of the key reasons for providing business of law training to first-year associates is to help them understand the firm's overall business strategy. Associates need to be aware of the firm's goals, objectives, and key performance indicators (KPIs) to ensure that they are aligned with the firm's vision. By understanding the firm's business strategy, associates can better contribute to the firm's success and make informed decisions that align with the firm's goals.\n\nAnother important aspect of business of law training is to help associates understand the firm's financial operations. Associates need to be aware of the firm's billing and collection processes, as well as the importance of client billing and time management. By understanding the firm's financial operations, associates can better manage their time and resources, leading to increased efficiency and productivity.\n\nIn addition to understanding the firm's business strategy and financial operations, associates also need to be equipped with the skills to manage client relationships. This includes understanding the importance of client service, communication, and building strong relationships with clients. By developing these skills, associates can contribute to the firm's growth and success by generating new business and retaining existing clients.\n\nWhile some law firms may argue that business of law training is not necessary for first-year associates, as they are still learning the basics of the law, this approach can be detrimental to the firm's success in the long run. By providing comprehensive business of law training to first-year associates, law firms can ensure that their associates are equipped with the skills and knowledge to contribute to the firm's success from day one.\n\nOne example of a law firm that has successfully implemented business of law training for first-year associates is Jones Day. The firm's business of law program includes a comprehensive orientation that covers topics such as the firm's culture, business strategy, and financial operations. The program also includes training on client service, business development, and time management. 
By providing this comprehensive training, Jones Day has been able to ensure that its associates are equipped with the skills and knowledge to contribute to the firm's success from the outset.\n\nAnother example of a law firm that has successfully implemented business of law training is Skadden, Arps, Slate, Meagher & Flom LLP. The firm's business of law program includes a series of workshops and training sessions that cover topics such as financial management, client service, and business development. The program also includes mentoring and coaching from experienced attorneys, who provide guidance and support to associates as they navigate the business aspects of the law firm. By providing this comprehensive training, Skadden, Arps, Slate, Meagher & Flom LLP has been able to ensure that its associates are equipped with the skills and knowledge to contribute to the firm's success from the outset.\n\nIn conclusion, providing first-year associates with comprehensive business of law training is essential for the success of any law firm. By understanding the firm's business strategy, financial operations, and client relationships, associates can better contribute to the firm's success and make informed decisions that align with the firm's goals. Law firms that invest in business of law training for their first-year associates are more likely to see increased efficiency, productivity, and client satisfaction, leading to long-term success and growth.", "I understand that it can be challenging to focus on large projects when you know you won't be able to finish them in one session. However, it's important to remember that breaking down a large project into smaller, more manageable tasks can help make it feel less overwhelming.\n\nHere are some tips that might help you:\n\n1. Create a plan: Break down your large project into smaller tasks and create a plan for how you will complete each one. This can help you feel more in control and less overwhelmed.\n\n2. Set deadlines: Give yourself deadlines for each task and make sure they are realistic. This can help you stay on track and avoid procrastinating.\n\n3. Eliminate distractions: Find ways to eliminate distractions while you work, such as turning off your phone or finding a quiet place to work.\n\n4. Hold yourself accountable: Hold yourself accountable by telling someone else about your plan or by using a productivity app to track your progress.\n\n5. Take breaks: It's important to take breaks and give yourself time to recharge. This can actually help you be more productive in the long run.\n\nAs for starting conversations with me and stopping midway, it's fine to do that if you have new thoughts or questions to ask. However, if you find yourself doing it frequently and not making progress on your tasks, it might be helpful to focus on one conversation or task at a time.\n\nRemember, developing good habits takes time and practice. Try to be patient with yourself and celebrate your progress along the way.", "\"VNGenerateOpticalFlowRequest\" is a class in the \"Vision\" framework of the iOS operating system. It is used to request the generation of optical flow data for an image. Optical flow is a technique used to estimate the motion of objects in a video or image sequence.\n\nThe VNGenerateOpticalFlowRequest class is a request object that represents a request to generate optical flow data for an image. 
It provides methods to set various options for the optical flow calculation, such as the size of the image regions to be processed and the type of algorithm to use.\n\nOnce the request has been processed, the resulting optical flow data can be accessed through the \"output\" property of the request object. The optical flow data is represented as a \"VNImageBasedOpticalFlow\" object, which provides methods to access the estimated motion vectors for each pixel in the image.\n\nOverall, the VNGenerateOpticalFlowRequest class is a powerful tool for analyzing and understanding the motion of objects in images, and can be used for a variety of applications such as object tracking, motion analysis, and video stabilization."]}, {"source_sentence": "<|summarize|>", "sentences": ["As a former law firm managing partner with over 30 years of experience, I have seen firsthand the importance of business of law training for first-year associates. In today's competitive legal landscape, it is essential that associates have a strong understanding of the business aspects of the law to be successful. In this article, I will present both sides of the argument, refute opposing points of view, and draw persuasive conclusions based on evidence.\n\nFirst and foremost, it is important to recognize that the legal profession has changed significantly in recent years. Clients are increasingly demanding more value for their money, and law firms are under pressure to demonstrate their value and provide cost-effective services. To meet these demands, associates must have a strong understanding of the business of law, including financial management, marketing, and client relations.\n\nOne argument against business of law training for first-year associates is that they should focus solely on developing their legal skills and knowledge. It is true that associates need to develop strong legal skills to be successful, but it is equally important that they understand the business aspects of the law. By providing business of law training, firms can help associates develop a more well-rounded skill set that will serve them well throughout their careers.\n\nAnother argument against business of law training is that it can be time-consuming and take away from associates' legal work. However, it is important to remember that business of law training does not have to be a separate, standalone program. Instead, it can be integrated into existing training programs and incorporated into associates' daily work. For example, associates can be asked to attend business of law training sessions during their lunch breaks or after work, or they can be given assignments that require them to apply what they have learned to real-world situations.\n\nIn addition to providing business of law training, firms should also focus on creating a culture that values and rewards business development efforts. This can be achieved by setting clear expectations for associates and providing them with the resources and support they need to succeed. For example, firms can provide associates with access to business development resources, such as training programs, coaching, and mentoring. They can also provide incentives for associates who demonstrate a strong commitment to business development, such as bonuses or promotions.\n\nIn conclusion, it is clear that business of law training is essential for first-year associates in today's legal landscape. 
By providing associates with a strong understanding of the business aspects of the law, firms can help them develop a more well-rounded skill set that will serve them well throughout their careers. Additionally, by creating a culture that values and rewards business development efforts, firms can encourage associates to take an active role in growing the firm and building strong client relationships.", "Here are 20 jobs that are unlikely to be fully replaced by AI in the near future due to their reliance on human creativity, empathy, and complex problem-solving skills:\n\n1. Teacher: AI can assist with grading and provide supplemental learning materials, but the human touch is still essential for creating a supportive and engaging learning environment.\n\n2. Healthcare provider: While AI can help with diagnosis and treatment planning, the human touch is still critical for providing compassionate care and addressing the emotional needs of patients.\n\n3. Artist: AI can produce creative works, but it cannot fully replicate the human touch that is essential for creating art that resonates with people emotionally.\n\n4. Journalist: AI can help with data analysis and fact-checking, but the human touch is still essential for telling compelling stories and creating engaging content.\n\n5. Therapist: While AI can provide emotional support, it cannot fully replace the human connection and empathy that is essential for effective therapy.\n\n6. Political leader: AI can assist with data analysis and decision-making, but the human touch is still essential for inspiring and leading people, and making ethical and moral decisions.\n\n7. Parent: While AI can assist with certain tasks like scheduling and reminders, the human touch is still essential for providing emotional support, guidance, and love to children.\n\n8. Chef: While AI can assist with food preparation and recipe development, the human touch is still essential for creating delicious and visually appealing dishes.\n\n9. Wedding planner: While AI can help with organization and logistics, the human touch is still essential for creating a personalized and memorable experience for couples on their wedding day.\n\n10. Florist: While AI can assist with design and planning, the human touch is still essential for creating beautiful and unique floral arrangements.\n\n11. Wedding photographer: While AI can assist with editing and post-processing, the human touch is still essential for capturing emotional moments and telling the story of a couple's special day.\n\n12. Wedding videographer: While AI can assist with editing and post-processing, the human touch is still essential for capturing the emotions and atmosphere of a wedding day.\n\n13. Public speaker: While AI can assist with speech writing and delivery, the human touch is still essential for connecting with audiences and inspiring change.\n\n14. Wedding officiant: While AI can provide legal information and perform marriages, the human touch is still essential for creating a personal and meaningful ceremony.\n\n15. Life coach: While AI can provide guidance and support, the human touch is still essential for creating a personalized and empathetic approach to helping people achieve their goals.\n\n16. Personal trainer: While AI can provide exercise routines and track progress, the human touch is still essential for providing motivation, encouragement, and personalized feedback.\n\n17. 
Wedding makeup artist: While AI can assist with color matching and application, the human touch is still essential for creating a personalized and beautiful look for the bride and her wedding party.\n\n18. Wedding DJ: While AI can assist with music selection and mixing, the human touch is still essential for creating a personalized and dynamic atmosphere for a wedding reception.\n\n19. Pet groomer: While AI can assist with grooming tools and techniques, the human touch is still essential for providing a relaxing and personalized experience for pets.\n\n20. Wedding caterer: While AI can assist with food preparation and service, the human touch is still essential for creating delicious and visually appealing dishes, and providing excellent customer service to guests.\n\nThese jobs require a combination of technical skills, creativity, empathy, and complex problem-solving abilities that are difficult for AI to replicate. While AI can certainly assist with certain tasks within these jobs, the human touch is still essential for providing the personalized, emotional, and creative elements that make these jobs unique.", "Oh, boy! Where do I begin? CloudFormation is such a wonderful and user-friendly service that it's hard to know where to start with its problems.\n\nFirst of all, let's talk about the templates. Oh, those templates! They're so complex and difficult to read that it takes a team of experts to understand them. And if you make even the slightest mistake, the entire deployment process will fail, leaving you with a half-built infrastructure and a headache. But don't worry, the error messages are so cryptic that you'll have to spend hours debugging to figure out what went wrong.\n\nAnd speaking of deployment, let's talk about the deployment process itself. It's so slow and clunky that you'll want to tear your hair out. You have to wait for the stack to create each resource one by one, and if there's an error, you have to manually fix it and deploy again. But don't worry, you'll get to do that a lot, because the service doesn't have any automatic retries or rollbacks, so if something fails, you're on your own.\n\nAnd let's not forget about the pricing. Oh, the pricing! It's so confusing that you'll need a degree in finance to understand it. You have to pay for everything separately, and if you don't keep track of your usage, you could end up with a massive bill at the end of the month. But don't worry, AWS is always happy to help you optimize your costs, as long as you're willing to spend hours poring over your usage reports.\n\nAnd finally, let's talk about the support. Oh, the support! It's so bad that you'll be lucky to get a response within a week. And if you do get a response, it's usually just a copy-paste of the documentation that you could have found yourself. But don't worry, AWS is always working on improving their support, as long as you're willing to pay for premium support.\n\nIn conclusion, CloudFormation is a wonderful service that is complex, slow, expensive, and has terrible support. But hey, it's not like there are any alternatives, right?"]}, {"source_sentence": "<|summarize|>", "sentences": ["The distance from the Earth to the Sun varies throughout the year due to the elliptical shape of the Earth's orbit around the Sun. 
The average distance from the Earth to the Sun is approximately 93 million miles (150 million kilometers).\n\nHowever, the actual distance between the Earth and the Sun can range from approximately 91.4 million miles (147.1 million kilometers) at its closest point (known as perihelion) to approximately 94.5 million miles (152.1 million kilometers) at its farthest point (known as aphelion). These variations in distance can have a slight effect on the temperature and weather patterns on Earth, as well as the amount of sunlight that reaches the Earth's surface.\n\nIt is important to note that the distance from the Earth to the Sun is also affected by other factors, such as the gravitational interactions between the Earth, the Sun, and other planets in the solar system. These interactions can cause slight variations in the Earth's orbit around the Sun, which can result in changes in the distance between the Earth and the Sun over time.\n\nIn general, the distance from the Earth to the Sun is a crucial factor in determining the climate and weather patterns on Earth, and it is closely monitored by scientists and astronomers who study the solar system and its effects on our planet.", "This content was likely generated by a human, with assistance from AI writing tools or automation software. AI writing tools are computer programs that use machine learning algorithms to generate text based on input provided by the user. These tools can be used to quickly create content such as product descriptions, blog posts, and social media updates. Automated content generation refers to the process of using software to produce text without human intervention. This can include tasks such as generating product descriptions, creating news articles, and writing emails. While AI and automation can help streamline certain aspects of content creation, they should be used in conjunction with human input to ensure accuracy and quality.", "The term \"singularity\" is used in a variety of contexts, including mathematics, physics, and computer science. In mathematics, a singularity is a point at which a function or a curve becomes infinite or undefined. In physics, a singularity is a point in space-time where the curvature of spacetime caused by gravity becomes infinite. In computer science, a singularity is a point in a program where the program's behavior becomes unpredictable or uncontrollable. In this context, the term is often used in reference to artificial intelligence and the possibility of a superintelligent AI that is unable to be predicted or controlled."]}, {"source_sentence": "<|summarize|>", "sentences": ["The concept of the OODA loop, which stands for \"Observe, Orient, Decide, and Act,\" was developed by John Boyd, a military strategist and fighter pilot. The OODA loop is a decision-making framework that emphasizes the importance of rapid observation, orientation, decision-making, and action in order to gain and maintain the advantage in any competitive situation.\n\nThe reason why the OODA loop is considered to be so powerful is that it provides a structured approach to decision-making that can be applied in a wide variety of situations, from military operations to business strategy to personal life. 
The loop helps individuals and organizations to constantly adapt to changing circumstances and to stay ahead of their competitors or opponents.\n\nBy continuously observing their environment, orienting themselves to the situation, making quick decisions, and taking action, individuals and organizations can gain a competitive advantage and achieve their goals more effectively. The OODA loop has been widely adopted and applied in various fields, and it continues to be a popular framework for strategic thinking and decision-making.", "My fellow Americans,\n\nToday, I stand before you to discuss the greatness of music and the composers who have shaped our cultural landscape. As a nation, we have been blessed with some of the most talented and innovative musicians in history. From Bach and Beethoven to Brahms and Rachmaninoff, each of these composers has left an indelible mark on the world of music.\n\nBach and Beethoven are two of the most celebrated composers of all time. Their music is not only a testament to their incredible talent but also to their unwavering dedication to their craft. Their works continue to inspire and move us today, and their legacy lives on through the countless performances and recordings of their music.\n\nOn the other hand, Mozart has often been criticized for his lack of depth and substance in his music. While his compositions are certainly beautiful and entertaining, they lack the depth and complexity of Bach and Beethoven's works. Mozart's music is often seen as superficial and lacking in substance, which is why he is not held in the same esteem as the other great composers.\n\nFelix Mendelssohn Bartholdy is a composer who has often been overlooked, despite his incredible talent and contributions to the world of music. Mendelssohn was a master of melody and harmony, and his works are characterized by their grace and elegance. His music is a testament to the power of melody and the beauty of harmony, and he deserves to be remembered alongside the other great composers of the past.\n\nFinally, I would be remiss if I did not mention Brahms and Rachmaninoff, two of the most beloved composers of the 19th and 20th centuries. Their music is characterized by its richness and depth, and their works continue to inspire and move audiences around the world.\n\nIn conclusion, music is an essential part of our cultural heritage, and the composers who have shaped our musical landscape deserve to be remembered and celebrated. Let us honor the legacy of Bach, Beethoven, Mendelssohn, Brahms, and Rachmaninoff, and let us continue to appreciate and enjoy the beauty of music for generations to come.", "The term \"G.O.A.T.\" (Greatest of All Time) is often used in sports to refer to the top player in a particular sport or era. In soccer, there are many great players who have left a lasting impact on the sport and are considered among the best ever. Here are a few of the top contenders for the title of G.O.A.T. in soccer:\n\n1. Pelé: Pelé is widely considered one of the greatest soccer players of all time. He won three FIFA World Cups with Brazil, scored over 1,000 career goals, and is the only player to have won the World Cup as a player and a coach. Pelé is known for his exceptional technical ability, vision, and goal-scoring prowess, and is often referred to as the \"King of Soccer.\"\n\n2. Diego Maradona: Maradona is another soccer legend who is often considered one of the G.O.A.T. candidates.
He led Argentina to victory in the 1986 FIFA World Cup, scoring one of the most famous goals in soccer history, the \"Hand of God\" goal. Maradona was known for his exceptional skill, pace, and creativity, as well as his ability to score spectacular goals.\n\n3. Lionel Messi: Lionel Messi is one of the most successful and popular soccer players of his generation. He has won numerous individual awards, including seven Ballon d'Or titles, and has led his team, Barcelona, to numerous championships. Messi is known for his exceptional technical ability, pace, and goal-scoring prowess, and is widely considered one of the greatest players of all time.\n\n4. Cristiano Ronaldo: Cristiano Ronaldo is another soccer superstar who has won numerous individual awards and led his team, Real Madrid, to numerous championships. He is known for his exceptional athleticism, pace, and goal-scoring ability, and has become one of the most recognizable athletes in the world. Ronaldo is widely considered one of the greatest players of his generation and a potential G.O.A.T. candidate.\n\n5. Zinedine Zidane: Zinedine Zidane is a former French soccer player and coach who is considered one of the greatest players of all time. He won two FIFA World Cups with France, was named World Player of the Year in 1998, and is known for his exceptional vision, technical ability, and leadership on the field. Zidane is often mentioned as a potential G.O.A.T. candidate.\n\nThese are just a few of the many great soccer players who have contributed to the sport's history and are often considered among the greatest of all time. The debate over who the G.O.A.T. in soccer is will likely continue for years to come."]}, {"source_sentence": "<|summarize|>", "sentences": ["Here's a description of each color in the palette for the digital design studio focused on promo design and thoughtful in-depth research:\n\n1. #0d1925: This is a deep, dark blue color that conveys a sense of trust, stability, and expertise. It's a great choice for a design studio that wants to convey a professional and reliable image.\n\n2. #7391ab: This is a light blue color that suggests a sense of calmness, freshness, and innovation. It's a great choice for a design studio that wants to convey a modern and forward-thinking image.\n\n3. #ffc895: This is a bright, warm yellow color that suggests energy, optimism, and creativity. It's a great choice for a design studio that wants to convey a fun and innovative image.\n\n4. #f6f6f6: This is a light gray color that suggests neutrality, balance, and sophistication. It's a great choice for a design studio that wants to convey a refined and elegant image.\n\n5. #385a76: This is a dark green color that suggests growth, nature, and sustainability. It's a great choice for a design studio that wants to convey a environmentally-conscious and responsible image.\n\nOverall, this color palette is well-rounded and versatile, with colors that can be used in a variety of ways to convey different messages and emotions. The use of deep blues, light blues, bright yellows, and neutral grays makes it a great choice for a design studio that wants to convey a professional, modern, and innovative image.", "How about \"Lingo Love\"? This name plays on the idea of using language models to help with dating, suggesting that the app can help you find your perfect match by using sophisticated messaging features. The name also has a playful and romantic feel to it, which is fitting for a dating app.\n\nHere are a few other ideas to consider:\n\n1. 
\"Chatterbox\": This name suggests that the app is all about chatting and getting to know each other, which is a key aspect of dating. It also has a friendly and approachable feel to it.\n\n2. \"Lingo Luck\": This name combines the idea of using language models with the concept of luck in dating, suggesting that the app can help you find love by using advanced messaging features. It also has a playful and romantic feel to it.\n\n3. \"Lingo Love Match\": This name combines the idea of using language models with the concept of finding your perfect match, suggesting that the app can help you find your ideal partner through sophisticated messaging features. It also has a romantic and serious feel to it.\n\n4. \"ChatterLover\": This name combines the idea of chatting and getting to know each other with the concept of finding love, suggesting that the app can help you find your perfect match through engaging conversations. It also has a friendly and romantic feel to it.\n\n5. \"Lingo Love Story\": This name combines the idea of using language models with the concept of finding your perfect match and creating a love story, suggesting that the app can help you find your ideal partner and build a relationship through sophisticated messaging features. It also has a romantic and serious feel to it.\n\nOverall, a name for a dating app that uses a LLMs to create suggested messages should be catchy, memorable, and reflective of the app's core values and features. It should also have a tone that is appropriate for the target audience and the type of dating experience the app offers.", "Here are some potential uses cases for large language models (LLMs) in investigative journalism:\n\n1. Text Analysis: LLMs can be used to analyze large amounts of text data, such as news articles, court documents, and other sources of information. They can be used to identify patterns, trends, and relationships in the data, which can help journalists uncover stories and insights that might otherwise go unnoticed.\n\n2. Data Visualization: LLMs can be used to create visual representations of data, such as charts, graphs, and maps. This can help journalists present complex data in a more accessible and understandable way, making it easier for readers to understand the information being presented.\n\n3. Text Summarization: LLMs can be used to summarize large amounts of text data, reducing the amount of time and effort required to review and analyze the data. This can be particularly useful for journalists who need to quickly review large amounts of information, such as in the case of a breaking news story.\n\n4. Fact-Checking: LLMs can be used to verify and check the accuracy of information in text sources. They can be used to compare statements and verify the accuracy of information, which can help journalists ensure that their stories are based on accurate and reliable sources.\n\n5. Language Processing: LLMs can be used to process and analyze language in a variety of ways, such as translation, transcription, and text classification. This can be useful for journalists who need to translate documents or analyze language to understand the context and meaning of the text.\n\n6. Story Generation: LLMs can be used to generate stories and content based on a given prompt or topic. 
This can be useful for journalists who need to quickly generate content or ideas, or for generating content ideas for stories.\n\nOverall, LLMs can be a powerful tool for investigative journalists, helping them to quickly analyze and make sense of large amounts of text data, and to generate insights and stories that might otherwise go unnoticed."]}], "model-index": [{"name": "SentenceTransformer based on sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2", "results": [{"task": {"type": "semantic-similarity", "name": "Semantic Similarity"}, "dataset": {"name": "sts dev", "type": "sts-dev"}, "metrics": [{"type": "pearson_cosine", "value": -0.17209387421860306, "name": "Pearson Cosine"}, {"type": "spearman_cosine", "value": -0.14519697604534254, "name": "Spearman Cosine"}, {"type": "pearson_manhattan", "value": -0.18478684918865068, "name": "Pearson Manhattan"}, {"type": "spearman_manhattan", "value": -0.22934609512092033, "name": "Spearman Manhattan"}, {"type": "pearson_euclidean", "value": -0.24554019485789957, "name": "Pearson Euclidean"}, {"type": "spearman_euclidean", "value": -0.2636925680131005, "name": "Spearman Euclidean"}, {"type": "pearson_dot", "value": -0.09827403403830653, "name": "Pearson Dot"}, {"type": "spearman_dot", "value": -0.07652978034449803, "name": "Spearman Dot"}, {"type": "pearson_max", "value": -0.09827403403830653, "name": "Pearson Max"}, {"type": "spearman_max", "value": -0.07652978034449803, "name": "Spearman Max"}, {"type": "pearson_cosine", "value": -0.5228815388202983, "name": "Pearson Cosine"}, {"type": "spearman_cosine", "value": -0.42466509615002906, "name": "Spearman Cosine"}, {"type": "pearson_manhattan", "value": 0.041871234564333504, "name": "Pearson Manhattan"}, {"type": "spearman_manhattan", "value": 0.01779323694411108, "name": "Spearman Manhattan"}, {"type": "pearson_euclidean", "value": -0.02187961676451103, "name": "Pearson Euclidean"}, {"type": "spearman_euclidean", "value": -0.034711877576677826, "name": "Spearman Euclidean"}, {"type": "pearson_dot", "value": -0.5406291665961442, "name": "Pearson Dot"}, {"type": "spearman_dot", "value": -0.42445765589990675, "name": "Spearman Dot"}, {"type": "pearson_max", "value": 0.041871234564333504, "name": "Pearson Max"}, {"type": "spearman_max", "value": 0.01779323694411108, "name": "Spearman Max"}, {"type": "pearson_cosine", "value": -0.868186555898593, "name": "Pearson Cosine"}, {"type": "spearman_cosine", "value": -0.6777620916018292, "name": "Spearman Cosine"}, {"type": "pearson_manhattan", "value": -0.8512368403264938, "name": "Pearson Manhattan"}, {"type": "spearman_manhattan", "value": -0.6299165589119777, "name": "Spearman Manhattan"}, {"type": "pearson_euclidean", "value": -0.8487518713213003, "name": "Pearson Euclidean"}, {"type": "spearman_euclidean", "value": -0.6237022202033926, "name": "Spearman Euclidean"}, {"type": "pearson_dot", "value": -0.8643809390831493, "name": "Pearson Dot"}, {"type": "spearman_dot", "value": -0.6508029354917555, "name": "Spearman Dot"}, {"type": "pearson_max", "value": -0.8487518713213003, "name": "Pearson Max"}, {"type": "spearman_max", "value": -0.6237022202033926, "name": "Spearman Max"}, {"type": "pearson_cosine", "value": 0.9544094126053565, "name": "Pearson Cosine"}, {"type": "spearman_cosine", "value": 0.9060595979711947, "name": "Spearman Cosine"}, {"type": "pearson_manhattan", "value": 0.942315396362075, "name": "Pearson Manhattan"}, {"type": "spearman_manhattan", "value": 0.9061702233866991, "name": "Spearman Manhattan"}, {"type": 
"pearson_euclidean", "value": 0.941528689832946, "name": "Pearson Euclidean"}, {"type": "spearman_euclidean", "value": 0.9061945563550459, "name": "Spearman Euclidean"}, {"type": "pearson_dot", "value": 0.9534770056190236, "name": "Pearson Dot"}, {"type": "spearman_dot", "value": 0.9026146734829041, "name": "Spearman Dot"}, {"type": "pearson_max", "value": 0.9544094126053565, "name": "Pearson Max"}, {"type": "spearman_max", "value": 0.9061945563550459, "name": "Spearman Max"}]}]}]} |
zeroMN/SHMT | zeroMN | audio-text-to-text | [
"transformers",
"transformer",
"multimodal",
"vqa",
"text",
"audio",
"audio-text-to-text",
"en",
"zh",
"dataset:zeroMN/nlp_corpus_zh",
"dataset:zeroMN/hanlp_date-zh",
"dataset:nyu-mll/glue",
"dataset:aps/super_glue",
"dataset:facebook/anli",
"dataset:tasksource/babi_nli",
"dataset:zeroMN/AVEdate",
"dataset:sick",
"dataset:snli",
"dataset:scitail",
"dataset:hans",
"dataset:alisawuffles/WANLI",
"dataset:tasksource/recast",
"dataset:sileod/probability_words_nli",
"dataset:joey234/nan-nli",
"dataset:pietrolesci/nli_fever",
"dataset:pietrolesci/breaking_nli",
"dataset:pietrolesci/conj_nli",
"dataset:pietrolesci/fracas",
"dataset:pietrolesci/dialogue_nli",
"dataset:pietrolesci/mpe",
"dataset:pietrolesci/dnc",
"dataset:pietrolesci/recast_white",
"dataset:pietrolesci/joci",
"dataset:pietrolesci/robust_nli",
"dataset:pietrolesci/robust_nli_is_sd",
"dataset:pietrolesci/robust_nli_li_ts",
"dataset:pietrolesci/gen_debiased_nli",
"dataset:pietrolesci/add_one_rte",
"dataset:tasksource/imppres",
"dataset:hlgd",
"dataset:paws",
"dataset:medical_questions_pairs",
"dataset:Anthropic/model-written-evals",
"dataset:truthful_qa",
"dataset:nightingal3/fig-qa",
"dataset:tasksource/bigbench",
"dataset:blimp",
"dataset:cos_e",
"dataset:cosmos_qa",
"dataset:dream",
"dataset:openbookqa",
"dataset:qasc",
"dataset:quartz",
"dataset:quail",
"dataset:head_qa",
"dataset:sciq",
"dataset:social_i_qa",
"dataset:wiki_hop",
"dataset:wiqa",
"dataset:piqa",
"dataset:hellaswag",
"dataset:pkavumba/balanced-copa",
"dataset:12ml/e-CARE",
"dataset:art",
"dataset:winogrande",
"dataset:codah",
"dataset:ai2_arc",
"dataset:definite_pronoun_resolution",
"dataset:swag",
"dataset:math_qa",
"dataset:metaeval/utilitarianism",
"dataset:mteb/amazon_counterfactual",
"dataset:SetFit/insincere-questions",
"dataset:SetFit/toxic_conversations",
"dataset:turingbench/TuringBench",
"dataset:trec",
"dataset:tals/vitaminc",
"dataset:hope_edi",
"dataset:strombergnlp/rumoureval_2019",
"dataset:ethos",
"dataset:tweet_eval",
"dataset:discovery",
"dataset:pragmeval",
"dataset:silicone",
"dataset:lex_glue",
"dataset:papluca/language-identification",
"dataset:imdb",
"dataset:rotten_tomatoes",
"dataset:ag_news",
"dataset:yelp_review_full",
"dataset:financial_phrasebank",
"dataset:poem_sentiment",
"dataset:dbpedia_14",
"dataset:amazon_polarity",
"dataset:app_reviews",
"dataset:hate_speech18",
"dataset:sms_spam",
"dataset:humicroedit",
"dataset:snips_built_in_intents",
"dataset:hate_speech_offensive",
"dataset:yahoo_answers_topics",
"dataset:pacovaldez/stackoverflow-questions",
"dataset:zapsdcn/hyperpartisan_news",
"dataset:zapsdcn/sciie",
"dataset:zapsdcn/citation_intent",
"dataset:go_emotions",
"dataset:allenai/scicite",
"dataset:liar",
"dataset:relbert/lexical_relation_classification",
"dataset:tasksource/linguisticprobing",
"dataset:tasksource/crowdflower",
"dataset:metaeval/ethics",
"dataset:emo",
"dataset:google_wellformed_query",
"dataset:tweets_hate_speech_detection",
"dataset:has_part",
"dataset:blog_authorship_corpus",
"dataset:launch/open_question_type",
"dataset:health_fact",
"dataset:commonsense_qa",
"dataset:mc_taco",
"dataset:ade_corpus_v2",
"dataset:prajjwal1/discosense",
"dataset:circa",
"dataset:PiC/phrase_similarity",
"dataset:copenlu/scientific-exaggeration-detection",
"dataset:quarel",
"dataset:mwong/fever-evidence-related",
"dataset:numer_sense",
"dataset:dynabench/dynasent",
"dataset:raquiba/Sarcasm_News_Headline",
"dataset:sem_eval_2010_task_8",
"dataset:demo-org/auditor_review",
"dataset:medmcqa",
"dataset:RuyuanWan/Dynasent_Disagreement",
"dataset:RuyuanWan/Politeness_Disagreement",
"dataset:RuyuanWan/SBIC_Disagreement",
"dataset:RuyuanWan/SChem_Disagreement",
"dataset:RuyuanWan/Dilemmas_Disagreement",
"dataset:lucasmccabe/logiqa",
"dataset:wiki_qa",
"dataset:tasksource/cycic_classification",
"dataset:tasksource/cycic_multiplechoice",
"dataset:tasksource/sts-companion",
"dataset:tasksource/commonsense_qa_2.0",
"dataset:tasksource/lingnli",
"dataset:tasksource/monotonicity-entailment",
"dataset:tasksource/arct",
"dataset:tasksource/scinli",
"dataset:tasksource/naturallogic",
"dataset:onestop_qa",
"dataset:demelin/moral_stories",
"dataset:corypaik/prost",
"dataset:aps/dynahate",
"dataset:metaeval/syntactic-augmentation-nli",
"dataset:tasksource/autotnli",
"dataset:lasha-nlp/CONDAQA",
"dataset:openai/webgpt_comparisons",
"dataset:Dahoas/synthetic-instruct-gptj-pairwise",
"dataset:metaeval/scruples",
"dataset:metaeval/wouldyourather",
"dataset:metaeval/defeasible-nli",
"dataset:tasksource/help-nli",
"dataset:metaeval/nli-veridicality-transitivity",
"dataset:tasksource/lonli",
"dataset:tasksource/dadc-limit-nli",
"dataset:ColumbiaNLP/FLUTE",
"dataset:tasksource/strategy-qa",
"dataset:openai/summarize_from_feedback",
"dataset:tasksource/folio",
"dataset:yale-nlp/FOLIO",
"dataset:tasksource/tomi-nli",
"dataset:tasksource/avicenna",
"dataset:stanfordnlp/SHP",
"dataset:GBaker/MedQA-USMLE-4-options-hf",
"dataset:sileod/wikimedqa",
"dataset:declare-lab/cicero",
"dataset:amydeng2000/CREAK",
"dataset:tasksource/mutual",
"dataset:inverse-scaling/NeQA",
"dataset:inverse-scaling/quote-repetition",
"dataset:inverse-scaling/redefine-math",
"dataset:tasksource/puzzte",
"dataset:tasksource/implicatures",
"dataset:race",
"dataset:tasksource/race-c",
"dataset:tasksource/spartqa-yn",
"dataset:tasksource/spartqa-mchoice",
"dataset:tasksource/temporal-nli",
"dataset:riddle_sense",
"dataset:tasksource/clcd-english",
"dataset:maximedb/twentyquestions",
"dataset:metaeval/reclor",
"dataset:tasksource/counterfactually-augmented-imdb",
"dataset:tasksource/counterfactually-augmented-snli",
"dataset:metaeval/cnli",
"dataset:tasksource/boolq-natural-perturbations",
"dataset:metaeval/acceptability-prediction",
"dataset:metaeval/equate",
"dataset:tasksource/ScienceQA_text_only",
"dataset:Jiangjie/ekar_english",
"dataset:tasksource/implicit-hate-stg1",
"dataset:metaeval/chaos-mnli-ambiguity",
"dataset:IlyaGusev/headline_cause",
"dataset:tasksource/logiqa-2.0-nli",
"dataset:tasksource/oasst2_dense_flat",
"dataset:sileod/mindgames",
"dataset:metaeval/ambient",
"dataset:metaeval/path-naturalness-prediction",
"dataset:civil_comments",
"dataset:AndyChiang/cloth",
"dataset:AndyChiang/dgen",
"dataset:tasksource/I2D2",
"dataset:webis/args_me",
"dataset:webis/Touche23-ValueEval",
"dataset:tasksource/starcon",
"dataset:PolyAI/banking77",
"dataset:tasksource/ConTRoL-nli",
"dataset:tasksource/tracie",
"dataset:tasksource/sherliic",
"dataset:tasksource/sen-making",
"dataset:tasksource/winowhy",
"dataset:tasksource/robustLR",
"dataset:CLUTRR/v1",
"dataset:tasksource/logical-fallacy",
"dataset:tasksource/parade",
"dataset:tasksource/cladder",
"dataset:tasksource/subjectivity",
"dataset:tasksource/MOH",
"dataset:tasksource/VUAC",
"dataset:tasksource/TroFi",
"dataset:sharc_modified",
"dataset:tasksource/conceptrules_v2",
"dataset:metaeval/disrpt",
"dataset:tasksource/zero-shot-label-nli",
"dataset:tasksource/com2sense",
"dataset:tasksource/scone",
"dataset:tasksource/winodict",
"dataset:tasksource/fool-me-twice",
"dataset:tasksource/monli",
"dataset:tasksource/corr2cause",
"dataset:lighteval/lsat_qa",
"dataset:tasksource/apt",
"dataset:zeroshot/twitter-financial-news-sentiment",
"dataset:tasksource/icl-symbol-tuning-instruct",
"dataset:tasksource/SpaceNLI",
"dataset:sihaochen/propsegment",
"dataset:HannahRoseKirk/HatemojiBuild",
"dataset:tasksource/regset",
"dataset:tasksource/esci",
"dataset:lmsys/chatbot_arena_conversations",
"dataset:neurae/dnd_style_intents",
"dataset:hitachi-nlp/FLD.v2",
"dataset:tasksource/SDOH-NLI",
"dataset:allenai/scifact_entailment",
"dataset:tasksource/feasibilityQA",
"dataset:tasksource/simple_pair",
"dataset:tasksource/AdjectiveScaleProbe-nli",
"dataset:tasksource/resnli",
"dataset:tasksource/SpaRTUN",
"dataset:tasksource/ReSQ",
"dataset:tasksource/semantic_fragments_nli",
"dataset:MoritzLaurer/dataset_train_nli",
"dataset:tasksource/stepgame",
"dataset:tasksource/nlgraph",
"dataset:tasksource/oasst2_pairwise_rlhf_reward",
"dataset:tasksource/hh-rlhf",
"dataset:tasksource/ruletaker",
"dataset:qbao775/PARARULE-Plus",
"dataset:tasksource/proofwriter",
"dataset:tasksource/logical-entailment",
"dataset:tasksource/nope",
"dataset:tasksource/LogicNLI",
"dataset:kiddothe2b/contract-nli",
"dataset:AshtonIsNotHere/nli4ct_semeval2024",
"dataset:tasksource/lsat-ar",
"dataset:tasksource/lsat-rc",
"dataset:AshtonIsNotHere/biosift-nli",
"dataset:tasksource/brainteasers",
"dataset:Anthropic/persuasion",
"dataset:erbacher/AmbigNQ-clarifying-question",
"dataset:tasksource/SIGA-nli",
"dataset:unigram/FOL-nli",
"dataset:tasksource/goal-step-wikihow",
"dataset:GGLab/PARADISE",
"dataset:tasksource/doc-nli",
"dataset:tasksource/mctest-nli",
"dataset:tasksource/patent-phrase-similarity",
"dataset:tasksource/natural-language-satisfiability",
"dataset:tasksource/idioms-nli",
"dataset:tasksource/lifecycle-entailment",
"dataset:nvidia/HelpSteer",
"dataset:nvidia/HelpSteer2",
"dataset:sadat2307/MSciNLI",
"dataset:pushpdeep/UltraFeedback-paired",
"dataset:tasksource/AES2-essay-scoring",
"dataset:tasksource/english-grading",
"dataset:tasksource/wice",
"dataset:Dzeniks/hover",
"dataset:sileod/missing-item-prediction",
"dataset:tasksource/tasksource_dpo_pairs",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
]
| 2025-01-06T04:33:44 | 2025-01-20T12:06:32 | 99 | 1 | ---
datasets:
- zeroMN/nlp_corpus_zh
- zeroMN/hanlp_date-zh
- nyu-mll/glue
- aps/super_glue
- facebook/anli
- tasksource/babi_nli
- zeroMN/AVEdate
- sick
- snli
- scitail
- hans
- alisawuffles/WANLI
- tasksource/recast
- sileod/probability_words_nli
- joey234/nan-nli
- pietrolesci/nli_fever
- pietrolesci/breaking_nli
- pietrolesci/conj_nli
- pietrolesci/fracas
- pietrolesci/dialogue_nli
- pietrolesci/mpe
- pietrolesci/dnc
- pietrolesci/recast_white
- pietrolesci/joci
- pietrolesci/robust_nli
- pietrolesci/robust_nli_is_sd
- pietrolesci/robust_nli_li_ts
- pietrolesci/gen_debiased_nli
- pietrolesci/add_one_rte
- tasksource/imppres
- hlgd
- paws
- medical_questions_pairs
- Anthropic/model-written-evals
- truthful_qa
- nightingal3/fig-qa
- tasksource/bigbench
- blimp
- cos_e
- cosmos_qa
- dream
- openbookqa
- qasc
- quartz
- quail
- head_qa
- sciq
- social_i_qa
- wiki_hop
- wiqa
- piqa
- hellaswag
- pkavumba/balanced-copa
- 12ml/e-CARE
- art
- winogrande
- codah
- ai2_arc
- definite_pronoun_resolution
- swag
- math_qa
- metaeval/utilitarianism
- mteb/amazon_counterfactual
- SetFit/insincere-questions
- SetFit/toxic_conversations
- turingbench/TuringBench
- trec
- tals/vitaminc
- hope_edi
- strombergnlp/rumoureval_2019
- ethos
- tweet_eval
- discovery
- pragmeval
- silicone
- lex_glue
- papluca/language-identification
- imdb
- rotten_tomatoes
- ag_news
- yelp_review_full
- financial_phrasebank
- poem_sentiment
- dbpedia_14
- amazon_polarity
- app_reviews
- hate_speech18
- sms_spam
- humicroedit
- snips_built_in_intents
- hate_speech_offensive
- yahoo_answers_topics
- pacovaldez/stackoverflow-questions
- zapsdcn/hyperpartisan_news
- zapsdcn/sciie
- zapsdcn/citation_intent
- go_emotions
- allenai/scicite
- liar
- relbert/lexical_relation_classification
- tasksource/linguisticprobing
- tasksource/crowdflower
- metaeval/ethics
- emo
- google_wellformed_query
- tweets_hate_speech_detection
- has_part
- blog_authorship_corpus
- launch/open_question_type
- health_fact
- commonsense_qa
- mc_taco
- ade_corpus_v2
- prajjwal1/discosense
- circa
- PiC/phrase_similarity
- copenlu/scientific-exaggeration-detection
- quarel
- mwong/fever-evidence-related
- numer_sense
- dynabench/dynasent
- raquiba/Sarcasm_News_Headline
- sem_eval_2010_task_8
- demo-org/auditor_review
- medmcqa
- RuyuanWan/Dynasent_Disagreement
- RuyuanWan/Politeness_Disagreement
- RuyuanWan/SBIC_Disagreement
- RuyuanWan/SChem_Disagreement
- RuyuanWan/Dilemmas_Disagreement
- lucasmccabe/logiqa
- wiki_qa
- tasksource/cycic_classification
- tasksource/cycic_multiplechoice
- tasksource/sts-companion
- tasksource/commonsense_qa_2.0
- tasksource/lingnli
- tasksource/monotonicity-entailment
- tasksource/arct
- tasksource/scinli
- tasksource/naturallogic
- onestop_qa
- demelin/moral_stories
- corypaik/prost
- aps/dynahate
- metaeval/syntactic-augmentation-nli
- tasksource/autotnli
- lasha-nlp/CONDAQA
- openai/webgpt_comparisons
- Dahoas/synthetic-instruct-gptj-pairwise
- metaeval/scruples
- metaeval/wouldyourather
- metaeval/defeasible-nli
- tasksource/help-nli
- metaeval/nli-veridicality-transitivity
- tasksource/lonli
- tasksource/dadc-limit-nli
- ColumbiaNLP/FLUTE
- tasksource/strategy-qa
- openai/summarize_from_feedback
- tasksource/folio
- yale-nlp/FOLIO
- tasksource/tomi-nli
- tasksource/avicenna
- stanfordnlp/SHP
- GBaker/MedQA-USMLE-4-options-hf
- sileod/wikimedqa
- declare-lab/cicero
- amydeng2000/CREAK
- tasksource/mutual
- inverse-scaling/NeQA
- inverse-scaling/quote-repetition
- inverse-scaling/redefine-math
- tasksource/puzzte
- tasksource/implicatures
- race
- tasksource/race-c
- tasksource/spartqa-yn
- tasksource/spartqa-mchoice
- tasksource/temporal-nli
- riddle_sense
- tasksource/clcd-english
- maximedb/twentyquestions
- metaeval/reclor
- tasksource/counterfactually-augmented-imdb
- tasksource/counterfactually-augmented-snli
- metaeval/cnli
- tasksource/boolq-natural-perturbations
- metaeval/acceptability-prediction
- metaeval/equate
- tasksource/ScienceQA_text_only
- Jiangjie/ekar_english
- tasksource/implicit-hate-stg1
- metaeval/chaos-mnli-ambiguity
- IlyaGusev/headline_cause
- tasksource/logiqa-2.0-nli
- tasksource/oasst2_dense_flat
- sileod/mindgames
- metaeval/ambient
- metaeval/path-naturalness-prediction
- civil_comments
- AndyChiang/cloth
- AndyChiang/dgen
- tasksource/I2D2
- webis/args_me
- webis/Touche23-ValueEval
- tasksource/starcon
- PolyAI/banking77
- tasksource/ConTRoL-nli
- tasksource/tracie
- tasksource/sherliic
- tasksource/sen-making
- tasksource/winowhy
- tasksource/robustLR
- CLUTRR/v1
- tasksource/logical-fallacy
- tasksource/parade
- tasksource/cladder
- tasksource/subjectivity
- tasksource/MOH
- tasksource/VUAC
- tasksource/TroFi
- sharc_modified
- tasksource/conceptrules_v2
- metaeval/disrpt
- tasksource/zero-shot-label-nli
- tasksource/com2sense
- tasksource/scone
- tasksource/winodict
- tasksource/fool-me-twice
- tasksource/monli
- tasksource/corr2cause
- lighteval/lsat_qa
- tasksource/apt
- zeroshot/twitter-financial-news-sentiment
- tasksource/icl-symbol-tuning-instruct
- tasksource/SpaceNLI
- sihaochen/propsegment
- HannahRoseKirk/HatemojiBuild
- tasksource/regset
- tasksource/esci
- lmsys/chatbot_arena_conversations
- neurae/dnd_style_intents
- hitachi-nlp/FLD.v2
- tasksource/SDOH-NLI
- allenai/scifact_entailment
- tasksource/feasibilityQA
- tasksource/simple_pair
- tasksource/AdjectiveScaleProbe-nli
- tasksource/resnli
- tasksource/SpaRTUN
- tasksource/ReSQ
- tasksource/semantic_fragments_nli
- MoritzLaurer/dataset_train_nli
- tasksource/stepgame
- tasksource/nlgraph
- tasksource/oasst2_pairwise_rlhf_reward
- tasksource/hh-rlhf
- tasksource/ruletaker
- qbao775/PARARULE-Plus
- tasksource/proofwriter
- tasksource/logical-entailment
- tasksource/nope
- tasksource/LogicNLI
- kiddothe2b/contract-nli
- AshtonIsNotHere/nli4ct_semeval2024
- tasksource/lsat-ar
- tasksource/lsat-rc
- AshtonIsNotHere/biosift-nli
- tasksource/brainteasers
- Anthropic/persuasion
- erbacher/AmbigNQ-clarifying-question
- tasksource/SIGA-nli
- unigram/FOL-nli
- tasksource/goal-step-wikihow
- GGLab/PARADISE
- tasksource/doc-nli
- tasksource/mctest-nli
- tasksource/patent-phrase-similarity
- tasksource/natural-language-satisfiability
- tasksource/idioms-nli
- tasksource/lifecycle-entailment
- nvidia/HelpSteer
- nvidia/HelpSteer2
- sadat2307/MSciNLI
- pushpdeep/UltraFeedback-paired
- tasksource/AES2-essay-scoring
- tasksource/english-grading
- tasksource/wice
- Dzeniks/hover
- sileod/missing-item-prediction
- tasksource/tasksource_dpo_pairs
language:
- en
- zh
library_name: transformers
license: apache-2.0
metrics:
- accuracy
- bleu
- wer
pipeline_tag: audio-text-to-text
tags:
- multimodal
- vqa
- text
- audio
widget:
- text: My name is Sylvain and I live in Paris
example_title: Parisian
- text: My name is Sarah and I live in London
example_title: Londoner
model-index:
- name: Evolutionary Multi-Modal Model
results:
- task:
type: vqa
name: Visual Question Answering
dataset:
name: Synthetic Multimodal Dataset
type: synthetic-dataset
split: test
metrics:
- type: accuracy
value: 85
---
# Model Card for the Evolutionary Multi-Modal Model

### Model Sources
The model must be used together with its dedicated code for each modality (audio, text, and natural language), because it relies on separate tokenizers and vocabularies to achieve the best results on special cases.
- **Repository:** [https://zeromn-zeromn-shmt.hf.space](https://zeromn-zeromn-shmt.hf.space)
- **Kaggle:** [https://www.kaggle.com/models/zeroeva/evolutionary-multi-modal](https://www.kaggle.com/models/zeroeva/evolutionary-multi-modal)
- **Demo:** [https://zeromn-zeromn-shmt.hf.space](https://zeromn-zeromn-shmt.hf.space)
<script
type="module"
src="https://gradio.s3-us-west-2.amazonaws.com/5.12.0/gradio.js"
></script>
<gradio-app src="https://zeromn-zeromn-shmt.hf.space"></gradio-app>
### Model test: breast_cancer_wisconsin_original
```python
from ucimlrepo import fetch_ucirepo

# fetch dataset
breast_cancer_wisconsin_original = fetch_ucirepo(id=15)

# data (as pandas dataframes)
X = breast_cancer_wisconsin_original.data.features
y = breast_cancer_wisconsin_original.data.targets

# metadata
print(breast_cancer_wisconsin_original.metadata)

# variable information
print(breast_cancer_wisconsin_original.variables)
```
Classification report from a test run on this dataset (137 test samples):

| class | precision | recall | f1-score | support |
|---|---|---|---|---|
| 0 | 0.93 | 0.99 | 0.96 | 79 |
| 1 | 0.98 | 0.90 | 0.94 | 58 |
| accuracy | | | 0.95 | 137 |
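The numbers above match the layout of a scikit-learn `classification_report`. Below is a hedged sketch of how such a report could be produced from the dataset fetched above; the choice of logistic regression, the 80/20 split, and dropping rows with missing values are assumptions, not documented by the model author:

```python
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# X and y come from the ucimlrepo snippet above; the original data
# has a few missing feature values, which are simply dropped here (an assumption)
frame = X.join(y).dropna()
features = frame[X.columns]
target = frame[y.columns[0]]

X_train, X_test, y_train, y_test = train_test_split(
    features, target, test_size=0.2, random_state=42, stratify=target
)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```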
This model, named `Evolutionary Multi-Modal Model`, is a multimodal transformer designed to handle a variety of tasks including vision and audio processing. It is built on top of the `adapter-transformers` and `transformers` libraries and is intended to be a versatile base model for both direct use and fine-tuning.
**Developed by:** Independent researcher
**Funded by:** Self-funded
**Shared by:** Independent researcher
**Model type:** Multimodal
**Language(s) (NLP):** English, Chinese
**License:** Apache-2.0
**Finetuned from model:** None
## Uses
Model repository: https://huggingface.co/zeroMN/SHMT
### Direct Use
```bash
git lfs install
git clone https://huggingface.co/zeroMN/SHMT.git
```
### Downstream Use
The model can be fine-tuned for specific tasks such as visual question answering (VQA), image captioning, and audio recognition.
### Out-of-Scope Use
The model is not suitable for tasks that require deep domain-specific expertise beyond its current capabilities, and the number of speech frames still needs to be tuned manually for your use case.
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the following risks, biases, and limitations:
- **Bias:** The model may exhibit biases present in the training data, particularly if the data is not representative of all populations.
- **Risks:** The model should not be used in critical applications where high accuracy and reliability are required without thorough testing and validation.
- **Limitations:** The model may not perform well on tasks that require fine-grained recognition or highly specialized audio processing.
## How to Get Started with the Model
```python
# Use a pipeline as a high-level helper
from transformers import pipeline
pipe = pipeline("text-generation", model="zeroMN/SHMT")
```
```python
# Load model directly
from transformers import AutoModel
model = AutoModel.from_pretrained("zeroMN/SHMT")
``` | [
"QUESTION_ANSWERING"
]
| [
"HEAD-QA",
"MEDQA",
"SCICITE",
"SCIFACT",
"SCIQ",
"SCITAIL"
]
| Non_BioNLP |
``` | {"datasets": ["zeroMN/nlp_corpus_zh", "zeroMN/hanlp_date-zh", "nyu-mll/glue", "aps/super_glue", "facebook/anli", "tasksource/babi_nli", "zeroMN/AVEdate", "sick", "snli", "scitail", "hans", "alisawuffles/WANLI", "tasksource/recast", "sileod/probability_words_nli", "joey234/nan-nli", "pietrolesci/nli_fever", "pietrolesci/breaking_nli", "pietrolesci/conj_nli", "pietrolesci/fracas", "pietrolesci/dialogue_nli", "pietrolesci/mpe", "pietrolesci/dnc", "pietrolesci/recast_white", "pietrolesci/joci", "pietrolesci/robust_nli", "pietrolesci/robust_nli_is_sd", "pietrolesci/robust_nli_li_ts", "pietrolesci/gen_debiased_nli", "pietrolesci/add_one_rte", "tasksource/imppres", "hlgd", "paws", "medical_questions_pairs", "Anthropic/model-written-evals", "truthful_qa", "nightingal3/fig-qa", "tasksource/bigbench", "blimp", "cos_e", "cosmos_qa", "dream", "openbookqa", "qasc", "quartz", "quail", "head_qa", "sciq", "social_i_qa", "wiki_hop", "wiqa", "piqa", "hellaswag", "pkavumba/balanced-copa", "12ml/e-CARE", "art", "winogrande", "codah", "ai2_arc", "definite_pronoun_resolution", "swag", "math_qa", "metaeval/utilitarianism", "mteb/amazon_counterfactual", "SetFit/insincere-questions", "SetFit/toxic_conversations", "turingbench/TuringBench", "trec", "tals/vitaminc", "hope_edi", "strombergnlp/rumoureval_2019", "ethos", "tweet_eval", "discovery", "pragmeval", "silicone", "lex_glue", "papluca/language-identification", "imdb", "rotten_tomatoes", "ag_news", "yelp_review_full", "financial_phrasebank", "poem_sentiment", "dbpedia_14", "amazon_polarity", "app_reviews", "hate_speech18", "sms_spam", "humicroedit", "snips_built_in_intents", "hate_speech_offensive", "yahoo_answers_topics", "pacovaldez/stackoverflow-questions", "zapsdcn/hyperpartisan_news", "zapsdcn/sciie", "zapsdcn/citation_intent", "go_emotions", "allenai/scicite", "liar", "relbert/lexical_relation_classification", "tasksource/linguisticprobing", "tasksource/crowdflower", "metaeval/ethics", "emo", "google_wellformed_query", "tweets_hate_speech_detection", "has_part", "blog_authorship_corpus", "launch/open_question_type", "health_fact", "commonsense_qa", "mc_taco", "ade_corpus_v2", "prajjwal1/discosense", "circa", "PiC/phrase_similarity", "copenlu/scientific-exaggeration-detection", "quarel", "mwong/fever-evidence-related", "numer_sense", "dynabench/dynasent", "raquiba/Sarcasm_News_Headline", "sem_eval_2010_task_8", "demo-org/auditor_review", "medmcqa", "RuyuanWan/Dynasent_Disagreement", "RuyuanWan/Politeness_Disagreement", "RuyuanWan/SBIC_Disagreement", "RuyuanWan/SChem_Disagreement", "RuyuanWan/Dilemmas_Disagreement", "lucasmccabe/logiqa", "wiki_qa", "tasksource/cycic_classification", "tasksource/cycic_multiplechoice", "tasksource/sts-companion", "tasksource/commonsense_qa_2.0", "tasksource/lingnli", "tasksource/monotonicity-entailment", "tasksource/arct", "tasksource/scinli", "tasksource/naturallogic", "onestop_qa", "demelin/moral_stories", "corypaik/prost", "aps/dynahate", "metaeval/syntactic-augmentation-nli", "tasksource/autotnli", "lasha-nlp/CONDAQA", "openai/webgpt_comparisons", "Dahoas/synthetic-instruct-gptj-pairwise", "metaeval/scruples", "metaeval/wouldyourather", "metaeval/defeasible-nli", "tasksource/help-nli", "metaeval/nli-veridicality-transitivity", "tasksource/lonli", "tasksource/dadc-limit-nli", "ColumbiaNLP/FLUTE", "tasksource/strategy-qa", "openai/summarize_from_feedback", "tasksource/folio", "yale-nlp/FOLIO", "tasksource/tomi-nli", "tasksource/avicenna", "stanfordnlp/SHP", "GBaker/MedQA-USMLE-4-options-hf", "sileod/wikimedqa", 
"declare-lab/cicero", "amydeng2000/CREAK", "tasksource/mutual", "inverse-scaling/NeQA", "inverse-scaling/quote-repetition", "inverse-scaling/redefine-math", "tasksource/puzzte", "tasksource/implicatures", "race", "tasksource/race-c", "tasksource/spartqa-yn", "tasksource/spartqa-mchoice", "tasksource/temporal-nli", "riddle_sense", "tasksource/clcd-english", "maximedb/twentyquestions", "metaeval/reclor", "tasksource/counterfactually-augmented-imdb", "tasksource/counterfactually-augmented-snli", "metaeval/cnli", "tasksource/boolq-natural-perturbations", "metaeval/acceptability-prediction", "metaeval/equate", "tasksource/ScienceQA_text_only", "Jiangjie/ekar_english", "tasksource/implicit-hate-stg1", "metaeval/chaos-mnli-ambiguity", "IlyaGusev/headline_cause", "tasksource/logiqa-2.0-nli", "tasksource/oasst2_dense_flat", "sileod/mindgames", "metaeval/ambient", "metaeval/path-naturalness-prediction", "civil_comments", "AndyChiang/cloth", "AndyChiang/dgen", "tasksource/I2D2", "webis/args_me", "webis/Touche23-ValueEval", "tasksource/starcon", "PolyAI/banking77", "tasksource/ConTRoL-nli", "tasksource/tracie", "tasksource/sherliic", "tasksource/sen-making", "tasksource/winowhy", "tasksource/robustLR", "CLUTRR/v1", "tasksource/logical-fallacy", "tasksource/parade", "tasksource/cladder", "tasksource/subjectivity", "tasksource/MOH", "tasksource/VUAC", "tasksource/TroFi", "sharc_modified", "tasksource/conceptrules_v2", "metaeval/disrpt", "tasksource/zero-shot-label-nli", "tasksource/com2sense", "tasksource/scone", "tasksource/winodict", "tasksource/fool-me-twice", "tasksource/monli", "tasksource/corr2cause", "lighteval/lsat_qa", "tasksource/apt", "zeroshot/twitter-financial-news-sentiment", "tasksource/icl-symbol-tuning-instruct", "tasksource/SpaceNLI", "sihaochen/propsegment", "HannahRoseKirk/HatemojiBuild", "tasksource/regset", "tasksource/esci", "lmsys/chatbot_arena_conversations", "neurae/dnd_style_intents", "hitachi-nlp/FLD.v2", "tasksource/SDOH-NLI", "allenai/scifact_entailment", "tasksource/feasibilityQA", "tasksource/simple_pair", "tasksource/AdjectiveScaleProbe-nli", "tasksource/resnli", "tasksource/SpaRTUN", "tasksource/ReSQ", "tasksource/semantic_fragments_nli", "MoritzLaurer/dataset_train_nli", "tasksource/stepgame", "tasksource/nlgraph", "tasksource/oasst2_pairwise_rlhf_reward", "tasksource/hh-rlhf", "tasksource/ruletaker", "qbao775/PARARULE-Plus", "tasksource/proofwriter", "tasksource/logical-entailment", "tasksource/nope", "tasksource/LogicNLI", "kiddothe2b/contract-nli", "AshtonIsNotHere/nli4ct_semeval2024", "tasksource/lsat-ar", "tasksource/lsat-rc", "AshtonIsNotHere/biosift-nli", "tasksource/brainteasers", "Anthropic/persuasion", "erbacher/AmbigNQ-clarifying-question", "tasksource/SIGA-nli", "unigram/FOL-nli", "tasksource/goal-step-wikihow", "GGLab/PARADISE", "tasksource/doc-nli", "tasksource/mctest-nli", "tasksource/patent-phrase-similarity", "tasksource/natural-language-satisfiability", "tasksource/idioms-nli", "tasksource/lifecycle-entailment", "nvidia/HelpSteer", "nvidia/HelpSteer2", "sadat2307/MSciNLI", "pushpdeep/UltraFeedback-paired", "tasksource/AES2-essay-scoring", "tasksource/english-grading", "tasksource/wice", "Dzeniks/hover", "sileod/missing-item-prediction", "tasksource/tasksource_dpo_pairs"], "language": ["en", "zh"], "library_name": "transformers", "license": "apache-2.0", "metrics": ["accuracy", "bleu", "wer"], "pipeline_tag": "audio-text-to-text", "tags": ["multimodal", "vqa", "text", "audio"], "widget": [{"text": "My name is Sylvain and I live in Paris", 
"example_title": "Parisian"}, {"text": "My name is Sarah and I live in London", "example_title": "Londoner"}], "model-index": [{"name": "Evolutionary Multi-Modal Model", "results": [{"task": {"type": "vqa", "name": "Visual Question Answering"}, "dataset": {"name": "Synthetic Multimodal Dataset", "type": "synthetic-dataset", "split": "test"}, "metrics": [{"type": "accuracy", "value": 85}]}]}]} |
zjunlp/OneKE | zjunlp | text-generation | [
"transformers",
"pytorch",
"llama",
"text-generation",
"en",
"zh",
"dataset:zjunlp/iepile",
"dataset:zjunlp/InstructIE",
"arxiv:2402.14710",
"license:cc-by-nc-sa-4.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
]
| 2024-02-23T09:28:16 | 2024-05-06T09:49:31 | 353 | 42 | ---
datasets:
- zjunlp/iepile
- zjunlp/InstructIE
language:
- en
- zh
license: cc-by-nc-sa-4.0
---
<p align="center">
<a href="https://github.com/zjunlp/deepke"> <img src="assets/oneke_logo.png" width="400"/></a>
</p>
<p align="center">
<a href="https://oneke.openkg.cn/">
<img alt="Documentation" src="https://img.shields.io/badge/demo-website-blue">
</a>
<a href="https://pypi.org/project/deepke/#files">
<img alt="PyPI" src="https://img.shields.io/pypi/v/deepke">
</a>
<a href="https://github.com/zjunlp/DeepKE/blob/master/LICENSE">
<img alt="GitHub" src="https://img.shields.io/github/license/zjunlp/deepke">
</a>
<a href="http://zjunlp.github.io/DeepKE">
<img alt="Documentation" src="https://img.shields.io/badge/doc-website-red">
</a>
</p>
<h1 align="center">
<p>OneKE: A Bilingual Large Language Model for <br>Knowledge Extraction</p>
</h1>
- [What is OneKE?](#what-is-oneke)
- [How is OneKE trained?](#how-is-oneke-trained)
- [Getting Started with OneKE](#getting-started-with-oneke)
- [Quick Start](#quick-start)
- [Advanced Use of OneKE](#advanced-use-of-oneke)
- [OneKE Instruction Format](#oneke-instruction-format)
- [Conversion of OneKE Instruction Format](#conversion-of-oneke-instruction-format)
- [Customized Schema Description Instructions](#customized-schema-description-instructions)
- [Customized Example Instructions](#customized-example-instructions)
- [Evaluation](#evaluation)
- [Continue Training](#continue-training)
- [Citation](#citation)
## What is OneKE?
OneKE is a large-scale model framework for knowledge extraction jointly developed by Ant Group and Zhejiang University. It possesses the capability of generalized knowledge extraction in bilingual Chinese and English, across multiple domains and tasks, and provides comprehensive toolchain support. OneKE has contributed to the OpenKG open knowledge graph community in an open-source manner.
Knowledge construction based on unstructured documents has always been one of the key challenges for the large-scale implementation of knowledge graphs. The high fragmentation and unstructured nature of real-world information, along with the substantial disparities between extracted content and its natural-language expression, often result in suboptimal performance of large language models on information extraction tasks. Natural-language text also contains ambiguity, polysemy, and metaphor arising from implicit and long-distance context associations, which poses significant challenges for knowledge extraction. In response to these issues, Ant Group and Zhejiang University leveraged their years of expertise in knowledge graphs and natural language processing to jointly construct and upgrade the knowledge extraction capabilities of Ant's large-scale model "BaiLing", releasing the bilingual knowledge extraction framework OneKE, which includes a version based on full-parameter fine-tuning of Chinese-Alpaca-2-13B. Evaluation metrics show that OneKE achieves relatively good performance on several fully supervised and zero-shot entity/relation/event extraction tasks.
The unified knowledge extraction framework has wide application scenarios and can significantly reduce the construction costs of domain-specific knowledge graphs. By extracting structured knowledge from massive datasets to construct high-quality knowledge graphs and establish logical associations between knowledge elements, interpretable inference and decision-making can be realized. It can also enhance large models by mitigating hallucination and boosting stability, accelerating the vertical domain applications of large models. For example, in the medical field, knowledge extraction can be used to convert doctors' experience into structured, rule-based management, building controlled auxiliary diagnostics, and medical Q&A systems. In the financial sector, it can extract financial indicators, risk events, causal logic, and industry chains for automated financial report generation, risk prediction, and industry chain analysis. In the public sector, it can facilitate knowledge-based management of government regulations, enhancing the efficiency and accuracy of public services.
<p align="center" width="100%">
<a href="" target="_blank"><img src="assets/oneke.gif" alt="OneKE" style="width: 100%; min-width: 20px; display: block; margin: auto;"></a>
</p>
## How is OneKE trained?
OneKE mainly focuses on schema-generalizable information extraction. Due to issues such as non-standard formats, noisy data, and lack of diversity in existing extraction instruction data, OneKE adopted techniques such as normalization and cleaning of extraction instructions, difficult negative sample collection, and schema-based batched instruction construction, as shown in the illustration. For more detailed information, refer to the paper "[IEPile: Unearthing Large-Scale Schema-Based Information Extraction Corpus](https://arxiv.org/abs/2402.14710) [[Github](https://github.com/zjunlp/IEPile)]".
The zero-shot generalization comparison results of OneKE with other large models are as follows:
* `NER-en`: CrossNER_AI, CrossNER_literature, CrossNER_music, CrossNER_politics, CrossNER_science
* `NER-zh`: WEIBONER, boson
* `RE-zh`: COAE2016, IPRE, SKE2020
* `RE-en`: FewRel, Wiki-ZSL
* `EE-en`: CrudeOilNews, WikiEvents, RAMS
* `EE-zh`: FewFC, CCF Law
<p align="center" width="50%">
<a href="" target="_blank"><img src="assets/oneke_results.png" alt="OneKE" style="width: 50%; min-width: 20px; display: block; margin: auto;"></a>
</p>


<details>
<summary><b>Supervision Results</b></summary>



</details>
## Getting Started with OneKE
### Quick Start
It is recommended to have at least **20GB of VRAM** for training and inference.
```python
import torch
from transformers import (
    AutoConfig,
    AutoTokenizer,
    AutoModelForCausalLM,
    GenerationConfig,
    BitsAndBytesConfig
)

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model_path = 'zjunlp/OneKE'

config = AutoConfig.from_pretrained(model_path, trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)

# 4-bit Quantized OneKE
quantization_config = BitsAndBytesConfig(
    load_in_4bit=True,
    llm_int8_threshold=6.0,
    llm_int8_has_fp16_weight=False,
    bnb_4bit_compute_dtype=torch.bfloat16,
    bnb_4bit_use_double_quant=True,
    bnb_4bit_quant_type="nf4",
)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    config=config,
    device_map="auto",
    quantization_config=quantization_config,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
)
model.eval()

# Build the instruction: a JSON string wrapped in the [INST] chat template
system_prompt = '<<SYS>>\nYou are a helpful assistant. 你是一个乐于助人的助手。\n<</SYS>>\n\n'
sintruct = "{\"instruction\": \"You are an expert in named entity recognition. Please extract entities that match the schema definition from the input. Return an empty list if the entity type does not exist. Please respond in the format of a JSON string.\", \"schema\": [\"person\", \"organization\", \"else\", \"location\"], \"input\": \"284 Robert Allenby ( Australia ) 69 71 71 73 , Miguel Angel Martin ( Spain ) 75 70 71 68 ( Allenby won at first play-off hole )\"}"
sintruct = '[INST] ' + system_prompt + sintruct + '[/INST]'

# Generate and decode only the newly generated tokens
input_ids = tokenizer.encode(sintruct, return_tensors="pt").to(device)
input_length = input_ids.size(1)
generation_output = model.generate(input_ids=input_ids, generation_config=GenerationConfig(max_length=1024, max_new_tokens=512, return_dict_in_generate=True))
generation_output = generation_output.sequences[0]
generation_output = generation_output[input_length:]
output = tokenizer.decode(generation_output, skip_special_tokens=True)
print(output)
```
For more detailed inference, please refer to [DeepKE-llm/InstructKGC/6.1.2 IE-Specific Model](https://github.com/zjunlp/DeepKE/blob/main/example/llm/InstructKGC/README_CN.md/#612ie%E4%B8%93%E7%94%A8%E6%A8%A1%E5%9E%8B).
### Advanced Use of OneKE
### OneKE Instruction Format
The instructions in OneKE are formatted in a dictionary-type string similar to JSON. It consists of three fields:
(1) **`'instruction'`**, which is the task description, specifies in natural language the role the model plays and the task to be completed;
(2) **`'schema'`**, a list of labels to be extracted, clearly indicates the key fields of the information to be extracted, reflecting the user's needs, and is dynamic and changeable;
(3) **`'input'`**, refers to the source text for information extraction.
Below are examples of instructions for various tasks:
<details>
<summary><b>Named Entity Recognition (NER)</b></summary>
```json
{
"instruction": "You are an expert specializing in entity extraction. Please extract entities that comply with the schema definition from the input; return an empty list for non-existent entity types. Please respond in the JSON string format.",
"schema": ["Person Name", "Education", "Position", "Nationality"],
"input": "Mr. Liu Zhijian: Born in 1956, Chinese nationality, no permanent residency abroad, member of the Communist Party, associate degree, senior economist."
}
```
</details>
<details>
<summary><b>Relation Extraction (RE)</b></summary>
```json
{
"instruction": "You are an expert specializing in relation extraction. Please extract relationship triples that comply with the schema definition from the input; return an empty list for non-existent relationships. Please respond in the JSON string format.",
"schema": ["Father", "Husband", "Postal Code", "Mother"],
"input": "Ding Long took out his life savings of $12,000, which without a doubt was a substantial amount at the end of the 19th century, plus Carpentier's donation, they both funded Columbia University's sinology research together."
}
```
</details>
<details>
<summary><b>Knowledge Graph Construction (KGC)</b></summary>
```json
{
"instruction": "You are an expert in structuring knowledge about graph entities. Based on the schema description of the input entity type, extract the corresponding entity instances and their property information from the text; do not output non-existent properties, return a list if there are multiple values for a property, and provide the output in a parseable json format.",
"schema": [
{
"entity_type": "Person",
"attributes": ["Chinese Name", "English Name", "Ancestral Home", "Date of Birth", "Place of Birth", "Occupation", "Alma Mater", "Works", "Awards"]
}
],
"input": "Jay Chou (Jay Chou), born on January 18, 1979, in New Taipei City, Taiwan Province, ancestral home in Yongchun County, Quanzhou City, Fujian Province, Chinese pop singer, musician, actor, director, screenwriter, graduated from Tamkang High School. In 2000, he released his debut album 'Jay'. In 2001, he cemented his style of blending Eastern and Western music with the album 'Fantasy'. In 2002, he held ‘The One’ world tour; the same year, he won the Best Composer award at the 13th Taiwan Golden Melody Awards with the song 'Love Before the Century'."
}
```
</details>
<details>
<summary><b>Event Extraction (EE)</b></summary>
```json
{
"instruction": "You are an expert specializing in event extraction. Please extract events that match the defined schema from the input; return an empty list for non-existent events, NAN for non-existent arguments, and a list if there are multiple values for an argument. Please provide your response in JSON string format.",
"schema": [
{
"event_type": "Finance/Trading - Interest Rate Hike",
"trigger": true,
"arguments": [
"Time"
]
},
{
"event_type": "Finance/Trading - Interest Rate Cut",
"trigger": true,
"arguments": [
"Cut Magnitude"
]
},
{
"event_type": "Finance/Trading - Price Increase",
"trigger": true,
"arguments": [
"Price Raiser"
]
},
{
"event_type": "Finance/Trading - Price Cut",
"trigger": true,
"arguments": [
"Price Cutter",
"Time"
]
}
],
"input": "AI risk control solution provider Vezetech secures tens of millions of dollars in Series C+ funding"
}
```
</details>
<details>
<summary><b>Event Trigger Identification (EET)</b></summary>
```json
{
"instruction": "You are an expert specializing in event trigger identification. Please extract the event types and triggers that match the defined schema from the input; return an empty list if the event type doesn't exist. Please provide your response in JSON string format.",
"schema": ["Organizational Relationship - Dissolve", "Organizational Relationship - Layoff", "Organizational Relationship - Dismiss", "Competition Behavior - Promotion"],
"input": "Nestlé lays off 4,000 employees: When the times leave you behind, they won't even say goodbye!"
}
```
</details>
<details>
<summary><b>Event Argument Extraction (EEA)</b></summary>
```json
{
"instruction": "You are an expert specializing in event argument extraction. Please extract the event arguments and their roles that match the defined schema from the input; return NAN or an empty dictionary for non-existent arguments, and a list if there are multiple values for an argument. Please provide your response in JSON string format.",
"schema": [{"event_type": "Organizational Relationship - Resignation/Departure", "arguments": ["Resigner", "Time", "Former Organization"]}],
"input": "Nestlé lays off 4,000 employees: When the times leave you behind, they won't even say goodbye!"
}
```
</details>
> Note: In consideration of the complexity of information extraction within specific domains and the high reliance on prompts, we support the integration of Schema descriptions and examples in the instructions to enhance the effectiveness of extraction tasks. For details, refer to **`Customized Schema Description Instructions`** and **`Customized Example Instructions`**. Please understand that due to the limited scale of the model, the model output is prompt-dependent and different prompts may yield inconsistent results.
### Conversion of OneKE Instruction Format
**List of Instructions**:
```python
instruction_mapper = {
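    # The 'zh' entries are Chinese-language prompts; the 'en' entries below are their English counterparts.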
'NERzh': "你是专门进行实体抽取的专家。请从input中抽取出符合schema定义的实体,不存在的实体类型返回空列表。请按照JSON字符串的格式回答。",
'REzh': "你是专门进行关系抽取的专家。请从input中抽取出符合schema定义的关系三元组,不存在的关系返回空列表。请按照JSON字符串的格式回答。",
'EEzh': "你是专门进行事件提取的专家。请从input中抽取出符合schema定义的事件,不存在的事件返回空列表,不存在的论元返回NAN,如果论元存在多值请返回列表。请按照JSON字符串的格式回答。",
'EETzh': "你是专门进行事件提取的专家。请从input中抽取出符合schema定义的事件类型及事件触发词,不存在的事件返回空列表。请按照JSON字符串的格式回答。",
'EEAzh': "你是专门进行事件论元提取的专家。请从input中抽取出符合schema定义的事件论元及论元角色,不存在的论元返回NAN或空字典,如果论元存在多值请返回列表。请按照JSON字符串的格式回答。",
'KGzh': '你是一个图谱实体知识结构化专家。根据输入实体类型(entity type)的schema描述,从文本中抽取出相应的实体实例和其属性信息,不存在的属性不输出, 属性存在多值就返回列表,并输出为可解析的json格式。',
'NERen': "You are an expert in named entity recognition. Please extract entities that match the schema definition from the input. Return an empty list if the entity type does not exist. Please respond in the format of a JSON string.",
'REen': "You are an expert in relationship extraction. Please extract relationship triples that match the schema definition from the input. Return an empty list for relationships that do not exist. Please respond in the format of a JSON string.",
'EEen': "You are an expert in event extraction. Please extract events from the input that conform to the schema definition. Return an empty list for events that do not exist, and return NAN for arguments that do not exist. If an argument has multiple values, please return a list. Respond in the format of a JSON string.",
'EETen': "You are an expert in event extraction. Please extract event types and event trigger words from the input that conform to the schema definition. Return an empty list for non-existent events. Please respond in the format of a JSON string.",
'EEAen': "You are an expert in event argument extraction. Please extract event arguments and their roles from the input that conform to the schema definition, which already includes event trigger words. If an argument does not exist, return NAN or an empty dictionary. Please respond in the format of a JSON string.",
'KGen': 'You are an expert in structured knowledge systems for graph entities. Based on the schema description of the input entity type, you extract the corresponding entity instances and their attribute information from the text. Attributes that do not exist should not be output. If an attribute has multiple values, a list should be returned. The results should be output in a parsable JSON format.',
}
```
Recommended **Split Numbers** for Each Task:
```python
split_num_mapper = {
'NER':6, 'RE':4, 'EE':4, 'EET':4, 'EEA':4, 'KG':1
}
```
Since predicting all schemas in the label set at once is too challenging and not easily scalable, OneKE uses a batched approach during training: the schemas queried in an instruction are split so that only a fixed number are asked about at a time. Hence, if the label set of a piece of data is too long, it will be split into multiple instructions that the model addresses in turn.
**Schema Format**:
```python
NER: ["Person Name", "Education", "Position", "Nationality"] # List of strings
RE: ["Father", "Husband", "Postal Code", "Mother"] # List of strings
EE: [{"event_type": "Finance/Trading - Interest Rate Hike", "trigger": True, "arguments": ["Time"]}, {"event_type": "Finance/Trading - Interest Rate Cut", "trigger": True, "arguments": ["Cut Magnitude"]}] # List of dictionaries, "event_type" is a string, "trigger" is a bool, "arguments" is a list
EET: ["Organizational Relationship - Dissolution", "Organizational Relationship - Layoff", "Organizational Relationship - Dismissal", "Competition Behavior - Advancement"] # List of strings
EEA: [{"event_type": "Finance/Trading - Interest Rate Hike", "arguments": ["Time"]}, {"event_type": "Finance/Trading - Interest Rate Cut", "arguments": ["Cut Magnitude"]}] # List of dictionaries, "event_type" is a string, "arguments" is a list
```
Below is a simple Batched Instruction Generation script:
```python
import json

def get_instruction(language, task, schema, input):
    sintructs = []
    split_num = split_num_mapper[task]
    if isinstance(schema, dict):
        sintruct = json.dumps({'instruction': instruction_mapper[task + language], 'schema': schema, 'input': input}, ensure_ascii=False)
        sintructs.append(sintruct)
    else:
        # query at most split_num schemas per instruction
        split_schemas = [schema[i:i + split_num] for i in range(0, len(schema), split_num)]
        for split_schema in split_schemas:
            sintruct = json.dumps({'instruction': instruction_mapper[task + language], 'schema': split_schema, 'input': input}, ensure_ascii=False)
            sintructs.append(sintruct)
    return sintructs
```
Below is an example using the aforementioned simple script:
```python
task = 'NER'
language = 'en'
schema = ['person', 'organization', 'else', 'location']
split_num = split_num_mapper[task]
split_schemas = [schema[i:i+split_num] for i in range(0, len(schema), split_num)]
input = '284 Robert Allenby ( Australia ) 69 71 71 73 , Miguel Angel Martin ( Spain ) 75 70 71 68 ( Allenby won at first play-off hole )'
sintructs = []
for split_schema in split_schemas:
sintruct = json.dumps({'instruction':instruction_mapper[task+language], 'schema':split_schema, 'input':input}, ensure_ascii=False)
sintructs.append(sintruct)
```
> '{"instruction": "You are an expert in named entity recognition. Please extract entities that match the schema definition from the input. Return an empty list if the entity type does not exist. Please respond in the format of a JSON string.", "schema": ["person", "organization", "else", "location"], "input": "284 Robert Allenby ( Australia ) 69 71 71 73 , Miguel Angel Martin ( Spain ) 75 70 71 68 ( Allenby won at first play-off hole )"}'
For more detailed data conversion, please refer to [DeepKE-llm/InstructKGC/README_CN.md/2.3 Test Data Conversion](https://github.com/zjunlp/DeepKE/blob/main/example/llm/InstructKGC/README_CN.md/#23%E6%B5%8B%E8%AF%95%E6%95%B0%E6%8D%AE%E8%BD%AC%E6%8D%A2).
### Customized Schema Description Instructions
```json
{
"instruction": "You are an expert specializing in entity extraction. Please extract entities that comply with the defined schema from the input; return an empty list for non-existent entity types. Please respond in JSON string format.",
"schema": {
"Position": "The entity type describes the occupation or official position of an individual or group, including specific role names such as 'producer', 'scorekeeper', 'ascetic', 'oil painter'.",
"Attraction": "The entity type of attraction includes buildings, museums, memorials, art galleries, rivers, peaks, etc. Representative entities include the Pentagon, Tate Modern, Zheng Chenggong Memorial Hall, Duxi Palace, Barikasa, Robo River, Gunung Batur, Yugong Yishan LIVE, Xu Beihong Memorial Hall, Madame Tussauds, etc.",
"Company": "Company is an entity type representing any legal entity or business organization. This type of entity can be a catering group, manufacturer, retailer, hotel, bank, design institute, etc. Examples include: 'Shangri-La Hotel Group', 'JVC', 'Shanghai Coolray Professional eSports Peripheral Store', 'K2•Haitang Bay', 'Wuhan Iron and Steel', 'louisvuitton', 'Bank of Scotland', 'Beijing Institute of Architectural Design', '7 Days Inn', 'Vanke Group'.",
"Address": "Address entities refer to entities with geographical location information, representing specific places such as a country, city, region, street, or abstract geographic areas. Examples include: 'the river dock at the southeast tip of downtown Manhattan', 'Tuapse', 'Venice, Italy', 'Huzhou Hot Spring Golf Course', 'North Carolina', 'Beijing-Tianjin region', 'Happy Internet Cafe', 'Yinian Nursing Home', 'Shangtang Town Pudong', 'Inner Mongolia Autonomous Region Chifeng City', etc.",
"Organization": "Organizational entities refer to collective organizations such as companies, shops, clubs, schools, etc. They play a certain role in social and economic activities and have certain personality rights.",
"Movie": "Movie entities include titles of movies in Chinese or English, and sometimes also include names of characters in films."
},
"input": "It is difficult for me to imagine setting up another Haifishing Plaza. When we obtained this project, I just happened to be in Sanya."
}
```
<details>
<summary><b>Relation Extraction (RE) Description Instructions</b></summary>
```json
{
"instruction": "You are an expert specializing in relation extraction. Please extract triples that match the defined schema from the input; return an empty list for non-existent relations. Please respond in JSON string format.",
"schema": {
"Ethnicity": "Ethnicity",
"Alma Mater": "This type of relationship describes the connection between a person and their alma mater; the person is the subject, and the alma mater is the object. By identifying the names of people and schools in the text and analyzing the relationship of graduation between them based on word combinations and contextual information.",
"Lead Actor": "This is a type of relationship that describes the connection between a film or television work and its main actors; the subject is the film or television work and the object is the actor. In a valid 'Lead Actor' relationship, the actor (object) plays an important role in the work (subject).",
"Father": "This type of relationship is used to indicate the kinship between a father and a child, where the father is the birth parent or caregiver of the child. In the triple, the subject of the 'Father' relation type is the child, and the object is the father."
},
"input": "Throughout history, all those who have portrayed the character 'Chu Liuxiang' from Gu Long's novels are recognized as handsome men in the entertainment industry. In 2011, 36-year-old Zhang Zhiyao played Chu Liuxiang in 'The New Adventures of Chu Liuxiang', remaining irresistibly handsome."
}
```
</details>
<details>
<summary><b>Event Extraction (EE) Description Instructions</b></summary>
```json
{
"instruction": "You are an expert specializing in event extraction. Please extract events that match the schema definition from the input; return an empty list for non-existent events, NAN for non-existent arguments, and a list if there are multiple values for an argument. Please respond in JSON string format.",
"schema": {
"Finance/Trading - Listing": {
"Finance/Trading - Listing": "The act of a financial entity being listed on the stock market mainly involves companies, stocks, etc. Positive examples include specific information about a company or stock listing, while negative examples are unrelated to such activities.",
"trigger": true,
"arguments": {
"Financing Amount": "Refers to the total amount of funds raised by a company in a listing event. It sums up the revenue of all share issues and is measured in currency, including but not limited to units like 'billion', 'million', 'dollars', 'RMB', etc.",
"Time": "Describes the specific time of the listing event, which can be a specific date or relative time, and may also include location information and specific days and weeks.",
"Listing Enterprise": "Refers to the company or enterprise that is conducting an IPO or has already been listed on the trading market in a listing event. Examples include: 'Shanghai Henlius Biotech', 'Three Squirrels', 'Baoxin Software', 'Little Bear Electric', 'Jinshang Bank', 'Beyond Meat (BYND)', 'DouYu gaming live-streaming platform', 'fast food empire', and 'autonomous driving lidar manufacturer Velodyne', etc.",
"Location": "The specific location of the financial or trading event, such as a city, building, or room."
}
},
"Organizational Relationship - Resignation/Departure": {
"Organizational Relationship - Resignation/Departure": "The event type 'Organizational Relationship - Resignation/Departure' refers to changes in the relationship between individuals or organizational members and their organization, mainly including 'resignation', 'requesting to resign', 'stepping down', 'leaving the team', 'retirement', 'leaving', etc. Often occurs in scenarios of high-level personnel changes, government officials changes, or athletes transfers. Examples: 'Li Nan announced resignation', 'Yu Xubo resigned from the position of chairman of the board just three months after taking office, Chen Lang succeeded'.",
"trigger": true,
"arguments": {
"Resigner": "Refers to the individual or group who actively or passively leaves their original position or job post in an organizational relationship resignation/departure event. It can be one person or a group of people, such as: 'Finance Minister', '90s born guy from Shaoyang Longhui, Ouyang En and', 'Xiong Xiaoge', '*ST Changsheng two deputy general managers', 'Yang Tao', 'pilot Ma Qiang', 'HE WEI', '5 Baidu executives', 'Youxin Group COO Peng Weilian', 'Jianke Institute securities representative Shu Yanming', etc.",
"Time": "Indicates the specific point in time or period when the resignation/departure event occurred, generally including specific dates, weeks, times, etc., like 'September 19', 'the evening of June 29', 'this Saturday', '10:30 AM on July 9', 'the morning of June 12', 'April 9', 'September 10', 'local time on Sunday', 'September 12', '10 AM on October 15', etc."
}
},
"Finance/Trading - Interest Rate Increase": {
"Finance/Trading - Interest Rate Increase": "This event describes banks or financial institutions raising interest rates to tighten the money supply. The typical trigger word is 'hike'. 'Hike' indicates the occurrence of the Finance/Trading - Interest Rate Increase event.",
"trigger": true,
"arguments": {
"Rate of Increase": "The rate of increase is usually presented as a percentage or basis points, indicating the degree or range of the interest rate hike in the event. Examples include: 'to 5.75%', '25 basis points', 'the benchmark rate from 0.25% up to 0.5%', '25 basis points'.",
"Hiking Institution": "The hiking institution is the financial institution with the authority to determine or implement the interest rate hike policy in a Finance/Trading - Interest Rate Increase event, such as central banks from different countries (e.g., Bank of England, Federal Reserve, European Central Bank) or financial institutions (e.g., Bank of England).",
"Time": "Indicates the specific date or time period when the Finance/Trading - Interest Rate Increase event occurred, such as 'the morning of June 18th', 'January 24th', 'three months later', etc. The specific expression includes time accurate to the minute, such as '11:00 on December 28, 2018', relative time, such as 'yesterday (2nd)', and special time expressions like 'Mid-Autumn Festival'."
}
},
"Organizational Relationship - Contract Termination": {
"Organizational Relationship - Contract Termination": "Situations of contract cancellation or termination usually occur in the business, entertainment, or sports domains. Trigger words include 'leave', 'trade', 'cut', 'contract expiry', 'contract termination', 'sell-off', 'release', 'send out', 'contract break', etc. Positive examples include 'Peng Yuchang terminates his contract' and 'Jiang Mengjie nearly bankrupt after contract termination'. Negative examples are like 'Federer withdrew from the competition'.",
"trigger": true,
"arguments": {
"Party Being Terminated": "In an organizational relationship contract termination event, the role is the party whose agreement or contract relation is being dissolved, and might be an individual or an organization, such as an athlete, film producer, company, etc. For instance, 'seven-time All-Star Joe Johnson', 'the production side of 'A Little Wish'', 'Raptors', 'Samsung', etc."
}
}
},
"input": "News from August 20th, according to Tencent News 'Frontline' report, informed sources stated that in order to control cost expenditure, NIO plans to reduce the number of staff at its U.S. branch, excluding those involved in the autonomous driving business, to about 200. As of August 16th, U.S. time, NIO's Silicon Valley branch had cut 100 employees."
}
```
</details>
<details>
<summary><b>Knowledge Graph Construction (KGC) Description Instructions</b></summary>
```json
{
"instruction": "You are an expert in structuring knowledge about graph entities. Based on the schema description for the input entity type, extract the corresponding entity instances and their attribute information from the text; do not output non-existent attributes, return a list for attributes with multiple values, and provide the output in a parseable JSON format.",
"schema": [
{
"entity_type": "Person",
"attributes": {
"Chinese Name": "The Chinese name of the person",
"English Name": "The English name of the person",
"Ancestral Home": "The ancestral address of the person",
"Date of Birth": "Birthday, birth date",
"Place of Birth": "The place of birth, administrative region",
"Occupation": "The occupation, position, identity of the person",
"Alma Mater": "The middle school, university, college from which the person graduated",
"Works": "Albums, songs, novels, published books, participated film and television works, etc.",
"Awards": "Various awards and honors received by the person"
}
}
],
"input": "Jay Chou (Jay Chou), born on January 18, 1979, in New Taipei City, Taiwan Province, with ancestral home in Yongchun County, Quanzhou City, Fujian Province, is a Chinese pop musician, actor, director, and screenwriter. He graduated from Tamkang High School. In 2000, he released his debut music album 'Jay.' In 2001, he cemented his fusion style of Eastern and Western music with the album 'Fantasy.' In 2002, he held 'The One' world tour; that same year, he won the Best Composer award at the 13th Taiwan Golden Melody Awards for the song 'Love Before the Century.'"
}
```
</details>
### Customized Example Instructions
Since example instances can be lengthy and the maximum training length of the model is limited, too many examples may adversely affect performance. We therefore suggest providing two examples, one positive and one negative, while keeping the number of schemas to one.
```json
{
"instruction": "You are an expert in entity extraction. Please extract entities from the input that fit the defined schema; return an empty list for non-existent entity types. Please respond in the format of a JSON string. You may refer to the example to guide your extraction.",
"schema": [
"Biomarker"
],
"example": [
{
"input": "Diagnostic criteria for CKD include: 1. Any of the following indicators persisting for more than 3 months; and meeting at least one criterion.(1) Signs of renal damage: Albuminuria [Albumin excretion rate (AER)≥30mg/24h; Albumin to creatinine ratio (ACR)≥3mg/mmol]; abnormal urinary sediment; tubular pathology; histological anomalies; structural abnormities found in imaging; history of kidney transplantation.(2) Decline in glomerular filtration rate: eGFR≤60ml·min-1·1.73m-2",
"output": {
"Biomarker": [
"Albumin excretion rate (AER)",
"Albumin to creatinine ratio (ACR)",
"Glomerular filtration rate",
"eGFR"
]
}
},
{
"input": "Application of DPP-4 inhibitors in specific populations",
"output": {
"Biomarker": []
}
}
],
"input": "Currently, all sulfonylurea drugs' leaflets list severe liver dysfunction as a contraindication. Alanine transaminase (ALT)> 3 times the upper limit of the reference value can serve as a sensitive and specific indicator of liver damage. If ALT>8-10 times the upper limit of the reference value or ALT>3 times with total serum bilirubin (TBIL)>2 times the reference value, it is considered a specific predictor of severe liver damage, indicating substantial injury to hepatic parenchymal cells; sulfonylureas should be contraindicated at this stage. Clinically, patients with decompensated liver cirrhosis accompanied by hepatic encephalopathy, ascites, or coagulation disorders should avoid this class of drugs to prevent hypoglycemia."
}
```
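A hedged sketch of assembling such an example-augmented instruction programmatically; the helper below simply mirrors the JSON structure shown above and is not part of the released toolchain:

```python
import json

def build_instruction_with_examples(instruction, schema, examples, text):
    """examples: list of (input, output) pairs; keep it to one positive and one negative."""
    payload = {
        'instruction': instruction,
        'schema': schema,
        'example': [{'input': inp, 'output': out} for inp, out in examples],
        'input': text,
    }
    return json.dumps(payload, ensure_ascii=False)
```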
<details>
<summary><b>Relationship Extraction (RE) Example Instruction</b></summary>
```json
{
"instruction": "You are an expert specialized in relationship extraction. Please extract from the input the defined relation triples according to the schema; return an empty list for non-existent relations. Please respond in the format of a JSON string. You may refer to the example for guidance on extraction.",
"schema": [
"Disease Staging and Typing"
],
"example": [
{
"input": "The foundational treatment of diabetes includes both education and management, as well as diet and exercise. A lack of knowledge in diabetes prevention and control is the primary reason for poor blood sugar management. Paying attention to the education and management of elderly patients is an important measure to improve the treatment level of diabetes.",
"output": {
"Disease Staging and Typing": []
}
},
{
"input": "Metabolites of glipizide have no hypoglycemic effect and are mostly excreted through feces, with only 5.0% excreted by the kidneys, thus are less affected by renal function. However, large clinical trials in patients with chronic kidney disease are limited. There have been studies observing the use of glipizide in patients with GFR10~50 ml min-1.(1.73m2)-1, but the trial designs are not perfect. Glipizide can be used in patients with stages 1 to 3 chronic kidney disease without dose adjustment; caution is advised in stage 4; and it is contraindicated in stage 5.",
"output": {
"Disease Staging and Typing": [
{
"subject": "Chronic kidney disease",
"object": "Chronic"
},
{
"subject": "Chronic kidney disease",
"object": "Chronic"
},
{
"subject": "Chronic kidney disease",
"object": "stages 1 to 3"
},
{
"subject": "Chronic kidney disease",
"object": "stage 4"
},
{
"subject": "Chronic kidney disease",
"object": "stage 5"
}
]
}
}
],
"input": "(2)NSAIDs: This includes both non-selective cyclooxygenase (COX) inhibitors and COX-2 inhibitors. If there are no contraindications, early and ample use of fast-acting NSAID formulations is recommended. Non-selective COX inhibitors primarily have gastrointestinal adverse reactions such as ulcers, perforations, and upper gastrointestinal bleeding, hence COX-2 inhibitors, which can reduce GI reactions by 50%, may be used for those intolerant to non-selective COX inhibitors. Active gastrointestinal ulcers/bleeding or a history of recurrent gastrointestinal ulcers/bleeding is a contraindication for all NSAIDs use. COX-2 inhibitors may increase the risk of cardiovascular events and should be avoided in patients with myocardial infarction or heart failure. Kidney function monitoring is required during the use of NSAIDs, and their use is not recommended in patients with severe chronic kidney disease (stages G4 to G5) who are not undergoing dialysis."
}
```
</details>
<details>
<summary><b>Event Extraction (EE) Example Instruction</b></summary>
```json
{
"instruction": "You are an expert specialized in event extraction. Please extract events from the input according to the defined schema; return an empty list for non-existent events, and 'NAN' for non-existent arguments. If an argument has multiple values, please return a list. Respond in the format of a JSON string. You may refer to the example for extraction guidance.",
"schema": [
{
"event_type": "Corporate Financing",
"trigger": true,
"arguments": [
"Disclosure Time",
"Investee",
"Financing Round",
"Lead Investor",
"Event Time",
"Investor",
"Financing Amount"
]
}
],
"example": [
{
"input": "Raise 2.5 billion yuan for expansion due to the 'three highs' condition of Joyson Electronics: high pledges, high goodwill, high debt\nReporter Zhang Jiazhen, from Beijing\nNingbo Joyson Electronic Corporation (hereinafter referred to as 'Joyson Electronics', 600699.SH), which holds billion-level big orders, is actively raising funds to expand production capacity to ease the increasingly pressing bottleneck of production capacity saturation.\nRecently, Joyson Electronics announced that it has received the 'Feedback Notice' from the China Securities Regulatory Commission, and its private stock offering is a step closer to approval.",
"output": {
"Corporate Financing": [
{
"trigger": "Raise",
"arguments": {
"Disclosure Time": "NAN",
"Investee": "Ningbo Joyson Electronic Corporation",
"Financing Round": "NAN",
"Lead Investor": "NAN",
"Event Time": "NAN",
"Investor": "NAN",
"Financing Amount": "2.5 billion yuan"
}
}
]
}
},
{
"input": "NIO stock falls to 13% before market; NIO reports over 3.2 billion loss in Q2\nOriginal Title: NIO stock falls to 13% before market; NIO reports over 3.2 billion loss in Q2\nNIO's stock price turned from a rise to a fall before market, falling to 13%. NIO released its Q2 earnings today, followed by the announcement of the cancellation of the earnings conference call originally scheduled for today.\nThe earnings report showed that NIO achieved a revenue of 1.508 billion yuan in the second quarter, exceeding market expectations of 1.309 billion yuan, compared to 46 million yuan in the same period last year; The net loss attributable to shareholders in the second quarter was 3.285 billion yuan, higher than the market expected loss of 2.944 billion yuan, compared to a loss of 6.11 billion yuan in the same period last year.",
"output": {
"Corporate Financing": []
}
}
],
"input": "【Exclusive】The 11th in five years, Codemao announces completion of C+ round financing of 250 million yuan\nJiemodui, April 17th - Today, Codemao announced the completion of a C+ round of financing worth 250 million yuan.\nThis comes five months after completing a C round financing of 400 million yuan last year, which is the new round of 'ammunition' added by Codemao.\nThe round was led by China Merchants International, with Bohai Capital, an equity investment fund under Bank of China Group, and existing shareholders Yueke Xintai and Shengyu Investment following suit."
}
```
</details>
## Evaluation
To extract structured content from the output text and to assess it, please refer to [DeepKE-llm/InstructKGC/README_CN.md/7.评估](https://github.com/zjunlp/DeepKE/blob/main/example/llm/InstructKGC/README_CN.md/#-7%E8%AF%84%E4%BC%B0).
## Continue Training
To continue training OneKE, refer to [DeepKE-llm/InstructKGC/4.9领域内数据继续训练](https://github.com/zjunlp/DeepKE/blob/main/example/llm/InstructKGC/README_CN.md/#49%E9%A2%86%E5%9F%9F%E5%86%85%E6%95%B0%E6%8D%AE%E7%BB%A7%E7%BB%AD%E8%AE%AD%E7%BB%83).
## Citation
If you have used OneKE in your work, please kindly cite the following paper:
```bibtex
@article{DBLP:journals/corr/abs-2402-14710,
author = {Honghao Gui and
Lin Yuan and
Hongbin Ye and
Ningyu Zhang and
Mengshu Sun and
Lei Liang and
Huajun Chen},
title = {IEPile: Unearthing Large-Scale Schema-Based Information Extraction
Corpus},
journal = {CoRR},
volume = {abs/2402.14710},
year = {2024},
url = {https://doi.org/10.48550/arXiv.2402.14710},
doi = {10.48550/ARXIV.2402.14710},
eprinttype = {arXiv},
eprint = {2402.14710},
timestamp = {Tue, 09 Apr 2024 07:32:43 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-2402-14710.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
``` | [
"NAMED_ENTITY_RECOGNITION",
"RELATION_EXTRACTION",
"EVENT_EXTRACTION"
]
| [
"BEAR"
]
| Non_BioNLP |
<p align="center">
<a href="https://github.com/zjunlp/deepke"> <img src="assets/oneke_logo.png" width="400"/></a>
</p>
<p align="center">
<a href="https://oneke.openkg.cn/">
<img alt="Documentation" src="https://img.shields.io/badge/demo-website-blue">
</a>
<a href="https://pypi.org/project/deepke/#files">
<img alt="PyPI" src="https://img.shields.io/pypi/v/deepke">
</a>
<a href="https://github.com/zjunlp/DeepKE/blob/master/LICENSE">
<img alt="GitHub" src="https://img.shields.io/github/license/zjunlp/deepke">
</a>
<a href="http://zjunlp.github.io/DeepKE">
<img alt="Documentation" src="https://img.shields.io/badge/doc-website-red">
</a>
</p>
<h1 align="center">
<p>OneKE: A Bilingual Large Language Model for <br>Knowledge Extraction</p>
</h1>
- [What is OneKE?](#what-is-oneke)
- [How is OneKE trained?](#how-is-oneke-trained)
- [Getting Started with OneKE](#getting-started-with-oneke)
- [Quick Start](#quick-start)
- [Advanced Use of OneKE](#advanced-use-of-oneke)
- [OneKE Instruction Format](#oneke-instruction-format)
- [Conversion of OneKE Instruction Format](#conversion-of-oneke-instruction-format)
- [Customized Schema Description Instructions](#customized-schema-description-instructions)
- [Customized Example Instructions](#customized-example-instructions)
- [Evaluation](#evaluation)
- [Continue Training](#continue-training)
- [Citation](#citation)
## What is OneKE?
OneKE is a large-model-based knowledge extraction framework jointly developed by Ant Group and Zhejiang University. It supports generalized knowledge extraction in both Chinese and English, across multiple domains and tasks, and ships with comprehensive toolchain support. OneKE has been contributed to the OpenKG open knowledge graph community as an open-source project.
Knowledge construction from unstructured documents has long been one of the key challenges for deploying knowledge graphs at scale. The high fragmentation and unstructured nature of real-world information, together with the substantial gap between extracted content and its natural-language expression, often lead to suboptimal performance of large language models on information extraction tasks. Natural language text also contains ambiguity, polysemy, and metaphor arising from implicit and long-distance context, posing significant challenges for knowledge extraction. In response, Ant Group and Zhejiang University leveraged their years of expertise in knowledge graphs and natural language processing to jointly build and upgrade the knowledge extraction capabilities of Ant's large model "BaiLing", and released the bilingual knowledge extraction framework OneKE, which includes a version based on full-parameter fine-tuning of Chinese-Alpaca-2-13B. Evaluation shows that OneKE achieves relatively strong performance on several fully supervised and zero-shot entity/relation/event extraction tasks.
A unified knowledge extraction framework has wide-ranging applications and can significantly reduce the cost of building domain-specific knowledge graphs. By extracting structured knowledge from massive datasets to construct high-quality knowledge graphs, and by establishing logical associations between knowledge elements, it enables interpretable inference and decision-making. It can also enhance large models by mitigating hallucination and improving stability, thereby accelerating their adoption in vertical domains. For example, in medicine, knowledge extraction can convert physicians' experience into structured, rule-based knowledge for controllable auxiliary diagnosis and medical Q&A systems. In finance, it can extract financial indicators, risk events, causal logic, and industry chains for automated report generation, risk prediction, and industry-chain analysis. In the public sector, it can support knowledge-based management of government regulations, improving the efficiency and accuracy of public services.
<p align="center" width="100%">
<a href="" target="_blank"><img src="assets/oneke.gif" alt="OneKE" style="width: 100%; min-width: 20px; display: block; margin: auto;"></a>
</p>
## How is OneKE trained?
OneKE mainly focuses on schema-generalizable information extraction. Because existing extraction instruction data suffer from non-standard formats, noise, and a lack of diversity, OneKE adopts techniques such as normalization and cleaning of extraction instructions, hard negative sample collection, and schema-based batched instruction construction, as shown in the illustration and sketched below. For more details, refer to the paper "[IEPile: Unearthing Large-Scale Schema-Based Information Extraction Corpus](https://arxiv.org/abs/2402.14710) [[Github](https://github.com/zjunlp/IEPile)]".
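For intuition only, the following sketch illustrates the general idea of hard-negative schema sampling combined with batched schema construction; the function name, `neg_ratio`, and shuffling are illustrative assumptions, not the exact IEPile procedure.
```python
import random

def build_schema_batches(gold_labels, label_set, split_num, neg_ratio=1.0):
    """Illustrative sketch: mix gold labels with sampled hard negatives,
    then split the result into fixed-size schema batches."""
    # hard negatives: labels from the full label set that are absent from this example
    negatives = [label for label in label_set if label not in gold_labels]
    k = min(len(negatives), max(1, int(len(gold_labels) * neg_ratio)))
    schema = list(gold_labels) + random.sample(negatives, k)
    random.shuffle(schema)
    # each chunk becomes the "schema" field of one training instruction
    return [schema[i:i + split_num] for i in range(0, len(schema), split_num)]
```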
The zero-shot generalization comparison results of OneKE with other large models are as follows:
* `NER-en`: CrossNER_AI, CrossNER_literature, CrossNER_music, CrossNER_politics, CrossNER_science
* `NER-zh`: WEIBONER, boson
* `RE-zh`: COAE2016, IPRE, SKE2020
* `RE-en`: FewRel, Wiki-ZSL
* `EE-en`: CrudeOilNews, WikiEvents, RAMS
* `EE-zh`: FewFC, CCF Law
<p align="center" width="50%">
<a href="" target="_blank"><img src="assets/oneke_results.png" alt="OneKE" style="width: 50%; min-width: 20px; display: block; margin: auto;"></a>
</p>


<details>
<summary><b>Supervision Results</b></summary>



</details>
## Getting Started with OneKE
### Quick Start
It is recommended to have at least **20GB of VRAM** for training and inference.
```python
import torch
from transformers import (
AutoConfig,
AutoTokenizer,
AutoModelForCausalLM,
GenerationConfig,
BitsAndBytesConfig
)
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model_path = 'zjunlp/OneKE'
config = AutoConfig.from_pretrained(model_path, trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)
# 4-bit Quantized OneKE
quantization_config=BitsAndBytesConfig(
load_in_4bit=True,
llm_int8_threshold=6.0,
llm_int8_has_fp16_weight=False,
bnb_4bit_compute_dtype=torch.bfloat16,
bnb_4bit_use_double_quant=True,
bnb_4bit_quant_type="nf4",
)
model = AutoModelForCausalLM.from_pretrained(
model_path,
config=config,
device_map="auto",
quantization_config=quantization_config,
torch_dtype=torch.bfloat16,
trust_remote_code=True,
)
model.eval()
# Build the prompt in the [INST] ... [/INST] chat format expected by the model
system_prompt = '<<SYS>>\nYou are a helpful assistant. 你是一个乐于助人的助手。\n<</SYS>>\n\n'
sintruct = "{\"instruction\": \"You are an expert in named entity recognition. Please extract entities that match the schema definition from the input. Return an empty list if the entity type does not exist. Please respond in the format of a JSON string.\", \"schema\": [\"person\", \"organization\", \"else\", \"location\"], \"input\": \"284 Robert Allenby ( Australia ) 69 71 71 73 , Miguel Angel Martin ( Spain ) 75 70 71 68 ( Allenby won at first play-off hole )\"}"
sintruct = '[INST] ' + system_prompt + sintruct + '[/INST]'

# Generate, then decode only the newly generated tokens (drop the prompt)
input_ids = tokenizer.encode(sintruct, return_tensors="pt").to(device)
input_length = input_ids.size(1)
generation_output = model.generate(input_ids=input_ids, generation_config=GenerationConfig(max_length=1024, max_new_tokens=512, return_dict_in_generate=True))
generation_output = generation_output.sequences[0]
generation_output = generation_output[input_length:]
output = tokenizer.decode(generation_output, skip_special_tokens=True)
print(output)
```
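The model replies with a JSON string. A minimal sketch for turning that reply into a Python object, reusing the `output` variable from above and assuming (but not guaranteeing) that the reply is well-formed JSON:
```python
import json

def parse_output(output: str):
    # guard the parse: the model may occasionally emit text that is not valid JSON
    try:
        return json.loads(output)
    except json.JSONDecodeError:
        return None

result = parse_output(output)
# e.g. {"person": ["Robert Allenby", "Miguel Angel Martin"], "organization": [], ...}
```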
For more detailed inference, please refer to [DeepKE-llm/InstructKGC §6.1.2 IE-Specific Model](https://github.com/zjunlp/DeepKE/blob/main/example/llm/InstructKGC/README_CN.md/#612ie%E4%B8%93%E7%94%A8%E6%A8%A1%E5%9E%8B).
### Advanced Use of OneKE
#### OneKE Instruction Format
The instructions in OneKE are formatted as a dictionary-style string similar to JSON. An instruction consists of three fields:
(1) **`'instruction'`**, the task description, which specifies in natural language the role the model plays and the task to be completed;
(2) **`'schema'`**, a list of labels to be extracted, which clearly indicates the key fields of the information to be extracted; it reflects the user's needs and is dynamic and changeable;
(3) **`'input'`**, the source text from which information is to be extracted.
Below are examples of instructions for various tasks:
<details>
<summary><b>Named Entity Recognition (NER)</b></summary>
```json
{
"instruction": "You are an expert specializing in entity extraction. Please extract entities that comply with the schema definition from the input; return an empty list for non-existent entity types. Please respond in the JSON string format.",
"schema": ["Person Name", "Education", "Position", "Nationality"],
"input": "Mr. Liu Zhijian: Born in 1956, Chinese nationality, no permanent residency abroad, member of the Communist Party, associate degree, senior economist."
}
```
</details>
<details>
<summary><b>Relation Extraction (RE)</b></summary>
```json
{
"instruction": "You are an expert specializing in relation extraction. Please extract relationship triples that comply with the schema definition from the input; return an empty list for non-existent relationships. Please respond in the JSON string format.",
"schema": ["Father", "Husband", "Postal Code", "Mother"],
"input": "Ding Long took out his life savings of $12,000, which without a doubt was a substantial amount at the end of the 19th century, plus Carpentier's donation, they both funded Columbia University's sinology research together."
}
```
</details>
<details>
<summary><b>Knowledge Graph Construction (KGC)</b></summary>
```json
{
"instruction": "You are an expert in structuring knowledge about graph entities. Based on the schema description of the input entity type, extract the corresponding entity instances and their property information from the text; do not output non-existent properties, return a list if there are multiple values for a property, and provide the output in a parseable json format.",
"schema": [
{
"entity_type": "Person",
"attributes": ["Chinese Name", "English Name", "Ancestral Home", "Date of Birth", "Place of Birth", "Occupation", "Alma Mater", "Works", "Awards"]
}
],
"input": "Jay Chou (Jay Chou), born on January 18, 1979, in New Taipei City, Taiwan Province, ancestral home in Yongchun County, Quanzhou City, Fujian Province, Chinese pop singer, musician, actor, director, screenwriter, graduated from Tamkang High School. In 2000, he released his debut album 'Jay'. In 2001, he cemented his style of blending Eastern and Western music with the album 'Fantasy'. In 2002, he held ‘The One’ world tour; the same year, he won the Best Composer award at the 13th Taiwan Golden Melody Awards with the song 'Love Before the Century'."
}
```
</details>
<details>
<summary><b>Event Extraction (EE)</b></summary>
```json
{
"instruction": "You are an expert specializing in event extraction. Please extract events that match the defined schema from the input; return an empty list for non-existent events, NAN for non-existent arguments, and a list if there are multiple values for an argument. Please provide your response in JSON string format.",
"schema": [
{
"event_type": "Finance/Trading - Interest Rate Hike",
"trigger": true,
"arguments": [
"Time"
]
},
{
"event_type": "Finance/Trading - Interest Rate Cut",
"trigger": true,
"arguments": [
"Cut Magnitude"
]
},
{
"event_type": "Finance/Trading - Price Increase",
"trigger": true,
"arguments": [
"Price Raiser"
]
},
{
"event_type": "Finance/Trading - Price Cut",
"trigger": true,
"arguments": [
"Price Cutter",
"Time"
]
}
],
"input": "AI risk control solution provider Vezetech secures tens of millions of dollars in Series C+ funding"
}
```
</details>
<details>
<summary><b>Event Trigger Identification (EET)</b></summary>
```json
{
"instruction": "You are an expert specializing in event trigger identification. Please extract the event types and triggers that match the defined schema from the input; return an empty list if the event type doesn't exist. Please provide your response in JSON string format.",
"schema": ["Organizational Relationship - Dissolve", "Organizational Relationship - Layoff", "Organizational Relationship - Dismiss", "Competition Behavior - Promotion"],
"input": "Nestlé lays off 4,000 employees: When the times leave you behind, they won't even say goodbye!"
}
```
</details>
<details>
<summary><b>Event Argument Extraction (EEA)</b></summary>
```json
{
"instruction": "You are an expert specializing in event argument extraction. Please extract the event arguments and their roles that match the defined schema from the input; return NAN or an empty dictionary for non-existent arguments, and a list if there are multiple values for an argument. Please provide your response in JSON string format.",
"schema": [{"event_type": "Organizational Relationship - Resignation/Departure", "arguments": ["Resigner", "Time", "Former Organization"]}],
"input": "Nestlé lays off 4,000 employees: When the times leave you behind, they won't even say goodbye!"
}
```
</details>
> Note: Given the complexity of information extraction in specific domains and its heavy reliance on prompts, OneKE supports embedding schema descriptions and in-context examples in the instructions to improve extraction quality. For details, refer to **`Customized Schema Description Instructions`** and **`Customized Example Instructions`**. Please also understand that, because of the model's limited scale, its output is prompt-dependent and different prompts may yield inconsistent results.
#### Conversion of OneKE Instruction Format
**List of Instructions**:
```python
# Task instructions keyed by task + language; the 'zh' prompts are intentionally kept in Chinese
instruction_mapper = {
'NERzh': "你是专门进行实体抽取的专家。请从input中抽取出符合schema定义的实体,不存在的实体类型返回空列表。请按照JSON字符串的格式回答。",
'REzh': "你是专门进行关系抽取的专家。请从input中抽取出符合schema定义的关系三元组,不存在的关系返回空列表。请按照JSON字符串的格式回答。",
'EEzh': "你是专门进行事件提取的专家。请从input中抽取出符合schema定义的事件,不存在的事件返回空列表,不存在的论元返回NAN,如果论元存在多值请返回列表。请按照JSON字符串的格式回答。",
'EETzh': "你是专门进行事件提取的专家。请从input中抽取出符合schema定义的事件类型及事件触发词,不存在的事件返回空列表。请按照JSON字符串的格式回答。",
'EEAzh': "你是专门进行事件论元提取的专家。请从input中抽取出符合schema定义的事件论元及论元角色,不存在的论元返回NAN或空字典,如果论元存在多值请返回列表。请按照JSON字符串的格式回答。",
'KGzh': '你是一个图谱实体知识结构化专家。根据输入实体类型(entity type)的schema描述,从文本中抽取出相应的实体实例和其属性信息,不存在的属性不输出, 属性存在多值就返回列表,并输出为可解析的json格式。',
'NERen': "You are an expert in named entity recognition. Please extract entities that match the schema definition from the input. Return an empty list if the entity type does not exist. Please respond in the format of a JSON string.",
'REen': "You are an expert in relationship extraction. Please extract relationship triples that match the schema definition from the input. Return an empty list for relationships that do not exist. Please respond in the format of a JSON string.",
'EEen': "You are an expert in event extraction. Please extract events from the input that conform to the schema definition. Return an empty list for events that do not exist, and return NAN for arguments that do not exist. If an argument has multiple values, please return a list. Respond in the format of a JSON string.",
'EETen': "You are an expert in event extraction. Please extract event types and event trigger words from the input that conform to the schema definition. Return an empty list for non-existent events. Please respond in the format of a JSON string.",
'EEAen': "You are an expert in event argument extraction. Please extract event arguments and their roles from the input that conform to the schema definition, which already includes event trigger words. If an argument does not exist, return NAN or an empty dictionary. Please respond in the format of a JSON string.",
'KGen': 'You are an expert in structured knowledge systems for graph entities. Based on the schema description of the input entity type, you extract the corresponding entity instances and their attribute information from the text. Attributes that do not exist should not be output. If an attribute has multiple values, a list should be returned. The results should be output in a parsable JSON format.',
}
```
Recommended **Split Numbers** for Each Task:
```python
# maximum number of schema labels queried per instruction, per task
split_num_mapper = {
'NER':6, 'RE':4, 'EE':4, 'EET':4, 'EEA':4, 'KG':1
}
```
Since predicting all schemas in the label set at once is too challenging and does not scale, OneKE uses a batched approach during training: each instruction queries only a fixed number of schemas at a time. Hence, if the label set of an example is too long, it is split into multiple instructions that the model answers in turn.
**Schema Format**:
```python
NER: ["Person Name", "Education", "Position", "Nationality"] # List of strings
RE: ["Father", "Husband", "Postal Code", "Mother"] # List of strings
EE: [{"event_type": "Finance/Trading - Interest Rate Hike", "trigger": True, "arguments": ["Time"]}, {"event_type": "Finance/Trading - Interest Rate Cut", "trigger": True, "arguments": ["Cut Magnitude"]}] # List of dictionaries, "event_type" is a string, "trigger" is a bool, "arguments" is a list
EET: ["Organizational Relationship - Dissolution", "Organizational Relationship - Layoff", "Organizational Relationship - Dismissal", "Competition Behavior - Advancement"] # List of strings
EEA: [{"event_type": "Finance/Trading - Interest Rate Hike", "arguments": ["Time"]}, {"event_type": "Finance/Trading - Interest Rate Cut", "arguments": ["Cut Magnitude"]}] # List of dictionaries, "event_type" is a string, "arguments" is a list
```
Below is a simple Batched Instruction Generation script:
```python
import json

def get_instruction(language, task, schema, input):
    sintructs = []
    split_num = split_num_mapper[task]
    if isinstance(schema, dict):
        # a single dict schema is sent as one instruction
        sintruct = json.dumps({'instruction': instruction_mapper[task + language], 'schema': schema, 'input': input}, ensure_ascii=False)
        sintructs.append(sintruct)
    else:
        # split long label lists into chunks of `split_num` schemas per instruction
        split_schemas = [schema[i:i + split_num] for i in range(0, len(schema), split_num)]
        for split_schema in split_schemas:
            sintruct = json.dumps({'instruction': instruction_mapper[task + language], 'schema': split_schema, 'input': input}, ensure_ascii=False)
            sintructs.append(sintruct)
    return sintructs
```
Below is an example using the aforementioned simple script:
```python
task = 'NER'
language = 'en'
schema = ['person', 'organization', 'else', 'location']
input = '284 Robert Allenby ( Australia ) 69 71 71 73 , Miguel Angel Martin ( Spain ) 75 70 71 68 ( Allenby won at first play-off hole )'
sintructs = get_instruction(language, task, schema, input)
```
> '{"instruction": "You are an expert in named entity recognition. Please extract entities that match the schema definition from the input. Return an empty list if the entity type does not exist. Please respond in the format of a JSON string.", "schema": ["person", "organization", "else", "location"], "input": "284 Robert Allenby ( Australia ) 69 71 71 73 , Miguel Angel Martin ( Spain ) 75 70 71 68 ( Allenby won at first play-off hole )"}'
For more detailed data conversion, please refer to [DeepKE-llm/InstructKGC/README_CN.md §2.3 Test Data Conversion](https://github.com/zjunlp/DeepKE/blob/main/example/llm/InstructKGC/README_CN.md/#23%E6%B5%8B%E8%AF%95%E6%95%B0%E6%8D%AE%E8%BD%AC%E6%8D%A2).
#### Customized Schema Description Instructions
The `schema` field can also be a dictionary that maps each label to a natural-language description, as in the following NER example:
```json
{
"instruction": "You are an expert specializing in entity extraction. Please extract entities that comply with the defined schema from the input; return an empty list for non-existent entity types. Please respond in JSON string format.",
"schema": {
"Position": "The entity type describes the occupation or official position of an individual or group, including specific role names such as 'producer', 'scorekeeper', 'ascetic', 'oil painter'.",
"Attraction": "The entity type of attraction includes buildings, museums, memorials, art galleries, rivers, peaks, etc. Representative entities include the Pentagon, Tate Modern, Zheng Chenggong Memorial Hall, Duxi Palace, Barikasa, Robo River, Gunung Batur, Yugong Yishan LIVE, Xu Beihong Memorial Hall, Madame Tussauds, etc.",
"Company": "Company is an entity type representing any legal entity or business organization. This type of entity can be a catering group, manufacturer, retailer, hotel, bank, design institute, etc. Examples include: 'Shangri-La Hotel Group', 'JVC', 'Shanghai Coolray Professional eSports Peripheral Store', 'K2•Haitang Bay', 'Wuhan Iron and Steel', 'louisvuitton', 'Bank of Scotland', 'Beijing Institute of Architectural Design', '7 Days Inn', 'Vanke Group'.",
"Address": "Address entities refer to entities with geographical location information, representing specific places such as a country, city, region, street, or abstract geographic areas. Examples include: 'the river dock at the southeast tip of downtown Manhattan', 'Tuapse', 'Venice, Italy', 'Huzhou Hot Spring Golf Course', 'North Carolina', 'Beijing-Tianjin region', 'Happy Internet Cafe', 'Yinian Nursing Home', 'Shangtang Town Pudong', 'Inner Mongolia Autonomous Region Chifeng City', etc.",
"Organization": "Organizational entities refer to collective organizations such as companies, shops, clubs, schools, etc. They play a certain role in social and economic activities and have certain personality rights.",
"Movie": "Movie entities include titles of movies in Chinese or English, and sometimes also include names of characters in films."
},
"input": "It is difficult for me to imagine setting up another Haifishing Plaza. When we obtained this project, I just happened to be in Sanya."
}
```
<details>
<summary><b>Relation Extraction (RE) Description Instructions</b></summary>
```json
{
"instruction": "You are an expert specializing in relation extraction. Please extract triples that match the defined schema from the input; return an empty list for non-existent relations. Please respond in JSON string format.",
"schema": {
"Ethnicity": "Ethnicity",
"Alma Mater": "This type of relationship describes the connection between a person and their alma mater; the person is the subject, and the alma mater is the object. By identifying the names of people and schools in the text and analyzing the relationship of graduation between them based on word combinations and contextual information.",
"Lead Actor": "This is a type of relationship that describes the connection between a film or television work and its main actors; the subject is the film or television work and the object is the actor. In a valid 'Lead Actor' relationship, the actor (object) plays an important role in the work (subject).",
"Father": "This type of relationship is used to indicate the kinship between a father and a child, where the father is the birth parent or caregiver of the child. In the triple, the subject of the 'Father' relation type is the child, and the object is the father."
},
"input": "Throughout history, all those who have portrayed the character 'Chu Liuxiang' from Gu Long's novels are recognized as handsome men in the entertainment industry. In 2011, 36-year-old Zhang Zhiyao played Chu Liuxiang in 'The New Adventures of Chu Liuxiang', remaining irresistibly handsome."
}
```
</details>
<details>
<summary><b>Event Extraction (EE) Description Instructions</b></summary>
```json
{
"instruction": "You are an expert specializing in event extraction. Please extract events that match the schema definition from the input; return an empty list for non-existent events, NAN for non-existent arguments, and a list if there are multiple values for an argument. Please respond in JSON string format.",
"schema": {
"Finance/Trading - Listing": {
"Finance/Trading - Listing": "The act of a financial entity being listed on the stock market mainly involves companies, stocks, etc. Positive examples include specific information about a company or stock listing, while negative examples are unrelated to such activities.",
"trigger": true,
"arguments": {
"Financing Amount": "Refers to the total amount of funds raised by a company in a listing event. It sums up the revenue of all share issues and is measured in currency, including but not limited to units like 'billion', 'million', 'dollars', 'RMB', etc.",
"Time": "Describes the specific time of the listing event, which can be a specific date or relative time, and may also include location information and specific days and weeks.",
"Listing Enterprise": "Refers to the company or enterprise that is conducting an IPO or has already been listed on the trading market in a listing event. Examples include: 'Shanghai Henlius Biotech', 'Three Squirrels', 'Baoxin Software', 'Little Bear Electric', 'Jinshang Bank', 'Beyond Meat (BYND)', 'DouYu gaming live-streaming platform', 'fast food empire', and 'autonomous driving lidar manufacturer Velodyne', etc.",
"Location": "The specific location of the financial or trading event, such as a city, building, or room."
}
},
"Organizational Relationship - Resignation/Departure": {
"Organizational Relationship - Resignation/Departure": "The event type 'Organizational Relationship - Resignation/Departure' refers to changes in the relationship between individuals or organizational members and their organization, mainly including 'resignation', 'requesting to resign', 'stepping down', 'leaving the team', 'retirement', 'leaving', etc. Often occurs in scenarios of high-level personnel changes, government officials changes, or athletes transfers. Examples: 'Li Nan announced resignation', 'Yu Xubo resigned from the position of chairman of the board just three months after taking office, Chen Lang succeeded'.",
"trigger": true,
"arguments": {
"Resigner": "Refers to the individual or group who actively or passively leaves their original position or job post in an organizational relationship resignation/departure event. It can be one person or a group of people, such as: 'Finance Minister', '90s born guy from Shaoyang Longhui, Ouyang En and', 'Xiong Xiaoge', '*ST Changsheng two deputy general managers', 'Yang Tao', 'pilot Ma Qiang', 'HE WEI', '5 Baidu executives', 'Youxin Group COO Peng Weilian', 'Jianke Institute securities representative Shu Yanming', etc.",
"Time": "Indicates the specific point in time or period when the resignation/departure event occurred, generally including specific dates, weeks, times, etc., like 'September 19', 'the evening of June 29', 'this Saturday', '10:30 AM on July 9', 'the morning of June 12', 'April 9', 'September 10', 'local time on Sunday', 'September 12', '10 AM on October 15', etc."
}
},
"Finance/Trading - Interest Rate Increase": {
"Finance/Trading - Interest Rate Increase": "This event describes banks or financial institutions raising interest rates to tighten the money supply. The typical trigger word is 'hike'. 'Hike' indicates the occurrence of the Finance/Trading - Interest Rate Increase event.",
"trigger": true,
"arguments": {
"Rate of Increase": "The rate of increase is usually presented as a percentage or basis points, indicating the degree or range of the interest rate hike in the event. Examples include: 'to 5.75%', '25 basis points', 'the benchmark rate from 0.25% up to 0.5%', '25 basis points'.",
"Hiking Institution": "The hiking institution is the financial institution with the authority to determine or implement the interest rate hike policy in a Finance/Trading - Interest Rate Increase event, such as central banks from different countries (e.g., Bank of England, Federal Reserve, European Central Bank) or financial institutions (e.g., Bank of England).",
"Time": "Indicates the specific date or time period when the Finance/Trading - Interest Rate Increase event occurred, such as 'the morning of June 18th', 'January 24th', 'three months later', etc. The specific expression includes time accurate to the minute, such as '11:00 on December 28, 2018', relative time, such as 'yesterday (2nd)', and special time expressions like 'Mid-Autumn Festival'."
}
},
"Organizational Relationship - Contract Termination": {
"Organizational Relationship - Contract Termination": "Situations of contract cancellation or termination usually occur in the business, entertainment, or sports domains. Trigger words include 'leave', 'trade', 'cut', 'contract expiry', 'contract termination', 'sell-off', 'release', 'send out', 'contract break', etc. Positive examples include 'Peng Yuchang terminates his contract' and 'Jiang Mengjie nearly bankrupt after contract termination'. Negative examples are like 'Federer withdrew from the competition'.",
"trigger": true,
"arguments": {
"Party Being Terminated": "In an organizational relationship contract termination event, the role is the party whose agreement or contract relation is being dissolved, and might be an individual or an organization, such as an athlete, film producer, company, etc. For instance, 'seven-time All-Star Joe Johnson', 'the production side of 'A Little Wish'', 'Raptors', 'Samsung', etc."
}
}
},
"input": "News from August 20th, according to Tencent News 'Frontline' report, informed sources stated that in order to control cost expenditure, NIO plans to reduce the number of staff at its U.S. branch, excluding those involved in the autonomous driving business, to about 200. As of August 16th, U.S. time, NIO's Silicon Valley branch had cut 100 employees."
}
```
</details>
<details>
<summary><b>Knowledge Graph Construction (KGC) Description Instructions</b></summary>
```json
{
"instruction": "You are an expert in structuring knowledge about graph entities. Based on the schema description for the input entity type, extract the corresponding entity instances and their attribute information from the text; do not output non-existent attributes, return a list for attributes with multiple values, and provide the output in a parseable JSON format.",
"schema": [
{
"entity_type": "Person",
"attributes": {
"Chinese Name": "The Chinese name of the person",
"English Name": "The English name of the person",
"Ancestral Home": "The ancestral address of the person",
"Date of Birth": "Birthday, birth date",
"Place of Birth": "The place of birth, administrative region",
"Occupation": "The occupation, position, identity of the person",
"Alma Mater": "The middle school, university, college from which the person graduated",
"Works": "Albums, songs, novels, published books, participated film and television works, etc.",
"Awards": "Various awards and honors received by the person"
}
}
],
"input": "Jay Chou (Jay Chou), born on January 18, 1979, in New Taipei City, Taiwan Province, with ancestral home in Yongchun County, Quanzhou City, Fujian Province, is a Chinese pop musician, actor, director, and screenwriter. He graduated from Tamkang High School. In 2000, he released his debut music album 'Jay.' In 2001, he cemented his fusion style of Eastern and Western music with the album 'Fantasy.' In 2002, he held 'The One' world tour; that same year, he won the Best Composer award at the 13th Taiwan Golden Melody Awards for the song 'Love Before the Century.'"
}
```
</details>
#### Customized Example Instructions
Given that example instances can be lengthy, and because the model's maximum training length is limited, too many examples may adversely affect model performance. We therefore suggest providing two examples, one positive and one negative, while keeping the number of schemas to one.
```json
{
"instruction": "You are an expert in entity extraction. Please extract entities from the input that fit the defined schema; return an empty list for non-existent entity types. Please respond in the format of a JSON string. You may refer to the example to guide your extraction.",
"schema": [
"Biomarker"
],
"example": [
{
"input": "Diagnostic criteria for CKD include: 1. Any of the following indicators persisting for more than 3 months; and meeting at least one criterion.(1) Signs of renal damage: Albuminuria [Albumin excretion rate (AER)≥30mg/24h; Albumin to creatinine ratio (ACR)≥3mg/mmol]; abnormal urinary sediment; tubular pathology; histological anomalies; structural abnormities found in imaging; history of kidney transplantation.(2) Decline in glomerular filtration rate: eGFR≤60ml·min-1·1.73m-2",
"output": {
"Biomarker": [
"Albumin excretion rate (AER)",
"Albumin to creatinine ratio (ACR)",
"Glomerular filtration rate",
"eGFR"
]
}
},
{
"input": "Application of DPP-4 inhibitors in specific populations",
"output": {
"Biomarker": []
}
}
],
"input": "Currently, all sulfonylurea drugs' leaflets list severe liver dysfunction as a contraindication. Alanine transaminase (ALT)> 3 times the upper limit of the reference value can serve as a sensitive and specific indicator of liver damage. If ALT>8-10 times the upper limit of the reference value or ALT>3 times with total serum bilirubin (TBIL)>2 times the reference value, it is considered a specific predictor of severe liver damage, indicating substantial injury to hepatic parenchymal cells; sulfonylureas should be contraindicated at this stage. Clinically, patients with decompensated liver cirrhosis accompanied by hepatic encephalopathy, ascites, or coagulation disorders should avoid this class of drugs to prevent hypoglycemia."
}
```
<details>
<summary><b>Relationship Extraction (RE) Example Instruction</b></summary>
```json
{
"instruction": "You are an expert specialized in relationship extraction. Please extract from the input the defined relation triples according to the schema; return an empty list for non-existent relations. Please respond in the format of a JSON string. You may refer to the example for guidance on extraction.",
"schema": [
"Disease Staging and Typing"
],
"example": [
{
"input": "The foundational treatment of diabetes includes both education and management, as well as diet and exercise. A lack of knowledge in diabetes prevention and control is the primary reason for poor blood sugar management. Paying attention to the education and management of elderly patients is an important measure to improve the treatment level of diabetes.",
"output": {
"Disease Staging and Typing": []
}
},
{
"input": "Metabolites of glipizide have no hypoglycemic effect and are mostly excreted through feces, with only 5.0% excreted by the kidneys, thus are less affected by renal function. However, large clinical trials in patients with chronic kidney disease are limited. There have been studies observing the use of glipizide in patients with GFR10~50 ml min-1.(1.73m2)-1, but the trial designs are not perfect. Glipizide can be used in patients with stages 1 to 3 chronic kidney disease without dose adjustment; caution is advised in stage 4; and it is contraindicated in stage 5.",
"output": {
"Disease Staging and Typing": [
{
"subject": "Chronic kidney disease",
"object": "Chronic"
},
{
"subject": "Chronic kidney disease",
"object": "Chronic"
},
{
"subject": "Chronic kidney disease",
"object": "stages 1 to 3"
},
{
"subject": "Chronic kidney disease",
"object": "stage 4"
},
{
"subject": "Chronic kidney disease",
"object": "stage 5"
}
]
}
}
],
"input": "(2)NSAIDs: This includes both non-selective cyclooxygenase (COX) inhibitors and COX-2 inhibitors. If there are no contraindications, early and ample use of fast-acting NSAID formulations is recommended. Non-selective COX inhibitors primarily have gastrointestinal adverse reactions such as ulcers, perforations, and upper gastrointestinal bleeding, hence COX-2 inhibitors, which can reduce GI reactions by 50%, may be used for those intolerant to non-selective COX inhibitors. Active gastrointestinal ulcers/bleeding or a history of recurrent gastrointestinal ulcers/bleeding is a contraindication for all NSAIDs use. COX-2 inhibitors may increase the risk of cardiovascular events and should be avoided in patients with myocardial infarction or heart failure. Kidney function monitoring is required during the use of NSAIDs, and their use is not recommended in patients with severe chronic kidney disease (stages G4 to G5) who are not undergoing dialysis."
}
```
</details>
<details>
<summary><b>Event Extraction (EE) Example Instruction</b></summary>
```json
{
"instruction": "You are an expert specialized in event extraction. Please extract events from the input according to the defined schema; return an empty list for non-existent events, and 'NAN' for non-existent arguments. If an argument has multiple values, please return a list. Respond in the format of a JSON string. You may refer to the example for extraction guidance.",
"schema": [
{
"event_type": "Corporate Financing",
"trigger": true,
"arguments": [
"Disclosure Time",
"Investee",
"Financing Round",
"Lead Investor",
"Event Time",
"Investor",
"Financing Amount"
]
}
],
"example": [
{
"input": "Raise 2.5 billion yuan for expansion due to the 'three highs' condition of Joyson Electronics: high pledges, high goodwill, high debt\nReporter Zhang Jiazhen, from Beijing\nNingbo Joyson Electronic Corporation (hereinafter referred to as 'Joyson Electronics', 600699.SH), which holds billion-level big orders, is actively raising funds to expand production capacity to ease the increasingly pressing bottleneck of production capacity saturation.\nRecently, Joyson Electronics announced that it has received the 'Feedback Notice' from the China Securities Regulatory Commission, and its private stock offering is a step closer to approval.",
"output": {
"Corporate Financing": [
{
"trigger": "Raise",
"arguments": {
"Disclosure Time": "NAN",
"Investee": "Ningbo Joyson Electronic Corporation",
"Financing Round": "NAN",
"Lead Investor": "NAN",
"Event Time": "NAN",
"Investor": "NAN",
"Financing Amount": "2.5 billion yuan"
}
}
]
}
},
{
"input": "NIO stock falls to 13% before market; NIO reports over 3.2 billion loss in Q2\nOriginal Title: NIO stock falls to 13% before market; NIO reports over 3.2 billion loss in Q2\nNIO's stock price turned from a rise to a fall before market, falling to 13%. NIO released its Q2 earnings today, followed by the announcement of the cancellation of the earnings conference call originally scheduled for today.\nThe earnings report showed that NIO achieved a revenue of 1.508 billion yuan in the second quarter, exceeding market expectations of 1.309 billion yuan, compared to 46 million yuan in the same period last year; The net loss attributable to shareholders in the second quarter was 3.285 billion yuan, higher than the market expected loss of 2.944 billion yuan, compared to a loss of 6.11 billion yuan in the same period last year.",
"output": {
"Corporate Financing": []
}
}
],
"input": "【Exclusive】The 11th in five years, Codemao announces completion of C+ round financing of 250 million yuan\nJiemodui, April 17th - Today, Codemao announced the completion of a C+ round of financing worth 250 million yuan.\nThis comes five months after completing a C round financing of 400 million yuan last year, which is the new round of 'ammunition' added by Codemao.\nThe round was led by China Merchants International, with Bohai Capital, an equity investment fund under Bank of China Group, and existing shareholders Yueke Xintai and Shengyu Investment following suit."
}
```
</details>
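Because in-context examples lengthen the prompt, a simple guard is to drop examples when the serialized instruction exceeds the training length. The sketch below assumes the `tokenizer` from the Quick Start; `MAX_LEN = 1024` is an assumed context budget, not a documented limit.
```python
import json

MAX_LEN = 1024  # assumed training context length; adjust to the real limit

def fit_instruction(instr: dict, tokenizer, max_len: int = MAX_LEN) -> str:
    """Drop trailing in-context examples until the serialized instruction fits."""
    sintruct = json.dumps(instr, ensure_ascii=False)
    while instr.get("example") and len(tokenizer.encode(sintruct)) > max_len:
        instr["example"].pop()  # remove the last example and re-serialize
        sintruct = json.dumps(instr, ensure_ascii=False)
    return sintruct
```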
## Evaluation
To extract structured content from the model output and evaluate it, please refer to [DeepKE-llm/InstructKGC/README_CN.md §7 Evaluation](https://github.com/zjunlp/DeepKE/blob/main/example/llm/InstructKGC/README_CN.md/#-7%E8%AF%84%E4%BC%B0).
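As a rough illustration only (the official scoring lives in the DeepKE repository linked above), extraction quality is typically scored by exact-match micro F1 between predicted and gold items:
```python
def exact_match_f1(pred: set, gold: set) -> float:
    # micro F1 over extracted items, e.g. (entity, type) pairs or (head, relation, tail) triples
    tp = len(pred & gold)
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    return 0.0 if precision + recall == 0 else 2 * precision * recall / (precision + recall)
```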
## Continue Training
To continue training OneKE on in-domain data, refer to [DeepKE-llm/InstructKGC §4.9 Continue Training with In-Domain Data](https://github.com/zjunlp/DeepKE/blob/main/example/llm/InstructKGC/README_CN.md/#49%E9%A2%86%E5%9F%9F%E5%86%85%E6%95%B0%E6%8D%AE%E7%BB%A7%E7%BB%AD%E8%AE%AD%E7%BB%83).
## Citation
If you have used OneKE in your work, please kindly cite the following paper:
```bibtex
@article{DBLP:journals/corr/abs-2402-14710,
author = {Honghao Gui and
Lin Yuan and
Hongbin Ye and
Ningyu Zhang and
Mengshu Sun and
Lei Liang and
Huajun Chen},
title = {IEPile: Unearthing Large-Scale Schema-Based Information Extraction
Corpus},
journal = {CoRR},
volume = {abs/2402.14710},
year = {2024},
url = {https://doi.org/10.48550/arXiv.2402.14710},
doi = {10.48550/ARXIV.2402.14710},
eprinttype = {arXiv},
eprint = {2402.14710},
timestamp = {Tue, 09 Apr 2024 07:32:43 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-2402-14710.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
``` | {"datasets": ["zjunlp/iepile", "zjunlp/InstructIE"], "language": ["en", "zh"], "license": "cc-by-nc-sa-4.0"} |
w601sxs/b1ade-embed-kd | w601sxs | sentence-similarity | [
"sentence-transformers",
"safetensors",
"bert",
"mteb",
"sentence-similarity",
"model-index",
"autotrain_compatible",
"text-embeddings-inference",
"endpoints_compatible",
"region:us"
]
| 2024-05-24T20:58:10 | 2024-05-28T18:31:24 | 276 | 1 | ---
library_name: sentence-transformers
pipeline_tag: sentence-similarity
tags:
- mteb
model-index:
- name: b1ade_embed_kd
results:
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification
type: mteb/amazon_counterfactual
config: default
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 75.81709145427287
- type: ap
value: 23.581082591688467
- type: f1
value: 62.54637626017967
- task:
type: Classification
dataset:
name: MTEB AmazonPolarityClassification
type: mteb/amazon_polarity
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 80.300125
- type: ap
value: 74.26836190039964
- type: f1
value: 80.2158066692679
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification
type: mteb/amazon_reviews_multi
config: default
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 43.084
- type: f1
value: 42.66774553366831
- task:
type: Retrieval
dataset:
name: MTEB ArguAna
type: mteb/arguana
config: default
split: test
revision: c22ab2a51041ffd869aaddef7af8d8215647e41a
metrics:
- type: map_at_1
value: 29.232000000000003
- type: map_at_10
value: 45.777
- type: map_at_100
value: 46.634
- type: map_at_1000
value: 46.64
- type: map_at_20
value: 46.489000000000004
- type: map_at_3
value: 40.861
- type: map_at_5
value: 43.659
- type: mrr_at_1
value: 30.156
- type: mrr_at_10
value: 46.141
- type: mrr_at_100
value: 46.983999999999995
- type: mrr_at_1000
value: 46.989999999999995
- type: mrr_at_20
value: 46.839
- type: mrr_at_3
value: 41.157
- type: mrr_at_5
value: 44.013000000000005
- type: ndcg_at_1
value: 29.232000000000003
- type: ndcg_at_10
value: 54.832
- type: ndcg_at_100
value: 58.303000000000004
- type: ndcg_at_1000
value: 58.451
- type: ndcg_at_20
value: 57.328
- type: ndcg_at_3
value: 44.685
- type: ndcg_at_5
value: 49.756
- type: precision_at_1
value: 29.232000000000003
- type: precision_at_10
value: 8.371
- type: precision_at_100
value: 0.985
- type: precision_at_1000
value: 0.1
- type: precision_at_20
value: 4.6690000000000005
- type: precision_at_3
value: 18.587
- type: precision_at_5
value: 13.627
- type: recall_at_1
value: 29.232000000000003
- type: recall_at_10
value: 83.71300000000001
- type: recall_at_100
value: 98.506
- type: recall_at_1000
value: 99.644
- type: recall_at_20
value: 93.38499999999999
- type: recall_at_3
value: 55.761
- type: recall_at_5
value: 68.137
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringP2P
type: mteb/arxiv-clustering-p2p
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 45.801946024895756
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringS2S
type: mteb/arxiv-clustering-s2s
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 37.639210206045206
- task:
type: Reranking
dataset:
name: MTEB AskUbuntuDupQuestions
type: mteb/askubuntudupquestions-reranking
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 57.589359041891576
- type: mrr
value: 70.88334872268389
- task:
type: STS
dataset:
name: MTEB BIOSSES
type: mteb/biosses-sts
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_pearson
value: 86.63594177060354
- type: cos_sim_spearman
value: 84.75132870687939
- type: euclidean_pearson
value: 85.05646621990854
- type: euclidean_spearman
value: 84.68686940680522
- type: manhattan_pearson
value: 85.08705700579426
- type: manhattan_spearman
value: 84.83446313127413
- task:
type: Classification
dataset:
name: MTEB Banking77Classification
type: mteb/banking77
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 85.1948051948052
- type: f1
value: 85.13229898343104
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringP2P
type: mteb/biorxiv-clustering-p2p
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 38.68616898014911
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringS2S
type: mteb/biorxiv-clustering-s2s
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 34.45376891835619
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackAndroidRetrieval
type: mteb/cqadupstack-android
config: default
split: test
revision: f46a197baaae43b4f621051089b82a364682dfeb
metrics:
- type: map_at_1
value: 26.340000000000003
- type: map_at_10
value: 36.513
- type: map_at_100
value: 37.968
- type: map_at_1000
value: 38.107
- type: map_at_20
value: 37.355
- type: map_at_3
value: 33.153
- type: map_at_5
value: 34.899
- type: mrr_at_1
value: 33.763
- type: mrr_at_10
value: 42.778
- type: mrr_at_100
value: 43.667
- type: mrr_at_1000
value: 43.724000000000004
- type: mrr_at_20
value: 43.349
- type: mrr_at_3
value: 40.32
- type: mrr_at_5
value: 41.657
- type: ndcg_at_1
value: 33.763
- type: ndcg_at_10
value: 42.783
- type: ndcg_at_100
value: 48.209999999999994
- type: ndcg_at_1000
value: 50.678999999999995
- type: ndcg_at_20
value: 45.073
- type: ndcg_at_3
value: 37.841
- type: ndcg_at_5
value: 39.818999999999996
- type: precision_at_1
value: 33.763
- type: precision_at_10
value: 8.398
- type: precision_at_100
value: 1.396
- type: precision_at_1000
value: 0.188
- type: precision_at_20
value: 5.0569999999999995
- type: precision_at_3
value: 18.503
- type: precision_at_5
value: 13.219
- type: recall_at_1
value: 26.340000000000003
- type: recall_at_10
value: 54.603
- type: recall_at_100
value: 77.264
- type: recall_at_1000
value: 93.882
- type: recall_at_20
value: 63.101
- type: recall_at_3
value: 39.6
- type: recall_at_5
value: 45.651
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackEnglishRetrieval
type: mteb/cqadupstack-english
config: default
split: test
revision: ad9991cb51e31e31e430383c75ffb2885547b5f0
metrics:
- type: map_at_1
value: 24.313000000000002
- type: map_at_10
value: 33.225
- type: map_at_100
value: 34.293
- type: map_at_1000
value: 34.421
- type: map_at_20
value: 33.818
- type: map_at_3
value: 30.545
- type: map_at_5
value: 32.144
- type: mrr_at_1
value: 31.083
- type: mrr_at_10
value: 39.199
- type: mrr_at_100
value: 39.835
- type: mrr_at_1000
value: 39.892
- type: mrr_at_20
value: 39.57
- type: mrr_at_3
value: 36.879
- type: mrr_at_5
value: 38.245000000000005
- type: ndcg_at_1
value: 31.083
- type: ndcg_at_10
value: 38.553
- type: ndcg_at_100
value: 42.685
- type: ndcg_at_1000
value: 45.144
- type: ndcg_at_20
value: 40.116
- type: ndcg_at_3
value: 34.608
- type: ndcg_at_5
value: 36.551
- type: precision_at_1
value: 31.083
- type: precision_at_10
value: 7.28
- type: precision_at_100
value: 1.183
- type: precision_at_1000
value: 0.168
- type: precision_at_20
value: 4.322
- type: precision_at_3
value: 16.858
- type: precision_at_5
value: 12.127
- type: recall_at_1
value: 24.313000000000002
- type: recall_at_10
value: 48.117
- type: recall_at_100
value: 65.768
- type: recall_at_1000
value: 81.935
- type: recall_at_20
value: 53.689
- type: recall_at_3
value: 36.335
- type: recall_at_5
value: 41.803000000000004
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGamingRetrieval
type: mteb/cqadupstack-gaming
config: default
split: test
revision: 4885aa143210c98657558c04aaf3dc47cfb54340
metrics:
- type: map_at_1
value: 33.013999999999996
- type: map_at_10
value: 44.567
- type: map_at_100
value: 45.664
- type: map_at_1000
value: 45.732
- type: map_at_20
value: 45.190000000000005
- type: map_at_3
value: 41.393
- type: map_at_5
value: 43.147000000000006
- type: mrr_at_1
value: 37.806
- type: mrr_at_10
value: 47.841
- type: mrr_at_100
value: 48.597
- type: mrr_at_1000
value: 48.638
- type: mrr_at_20
value: 48.262
- type: mrr_at_3
value: 45.361000000000004
- type: mrr_at_5
value: 46.803
- type: ndcg_at_1
value: 37.806
- type: ndcg_at_10
value: 50.412
- type: ndcg_at_100
value: 55.019
- type: ndcg_at_1000
value: 56.52
- type: ndcg_at_20
value: 52.23100000000001
- type: ndcg_at_3
value: 44.944
- type: ndcg_at_5
value: 47.535
- type: precision_at_1
value: 37.806
- type: precision_at_10
value: 8.351
- type: precision_at_100
value: 1.163
- type: precision_at_1000
value: 0.134
- type: precision_at_20
value: 4.727
- type: precision_at_3
value: 20.376
- type: precision_at_5
value: 14.056
- type: recall_at_1
value: 33.013999999999996
- type: recall_at_10
value: 64.35600000000001
- type: recall_at_100
value: 84.748
- type: recall_at_1000
value: 95.525
- type: recall_at_20
value: 71.137
- type: recall_at_3
value: 49.726
- type: recall_at_5
value: 56.054
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGisRetrieval
type: mteb/cqadupstack-gis
config: default
split: test
revision: 5003b3064772da1887988e05400cf3806fe491f2
metrics:
- type: map_at_1
value: 18.476
- type: map_at_10
value: 24.715999999999998
- type: map_at_100
value: 25.72
- type: map_at_1000
value: 25.826999999999998
- type: map_at_20
value: 25.276
- type: map_at_3
value: 22.656000000000002
- type: map_at_5
value: 23.737
- type: mrr_at_1
value: 20.113
- type: mrr_at_10
value: 26.423999999999996
- type: mrr_at_100
value: 27.328000000000003
- type: mrr_at_1000
value: 27.418
- type: mrr_at_20
value: 26.936
- type: mrr_at_3
value: 24.275
- type: mrr_at_5
value: 25.501
- type: ndcg_at_1
value: 20.113
- type: ndcg_at_10
value: 28.626
- type: ndcg_at_100
value: 33.649
- type: ndcg_at_1000
value: 36.472
- type: ndcg_at_20
value: 30.581999999999997
- type: ndcg_at_3
value: 24.490000000000002
- type: ndcg_at_5
value: 26.394000000000002
- type: precision_at_1
value: 20.113
- type: precision_at_10
value: 4.52
- type: precision_at_100
value: 0.739
- type: precision_at_1000
value: 0.10200000000000001
- type: precision_at_20
value: 2.706
- type: precision_at_3
value: 10.433
- type: precision_at_5
value: 7.48
- type: recall_at_1
value: 18.476
- type: recall_at_10
value: 39.129000000000005
- type: recall_at_100
value: 62.44
- type: recall_at_1000
value: 83.95700000000001
- type: recall_at_20
value: 46.611999999999995
- type: recall_at_3
value: 27.772000000000002
- type: recall_at_5
value: 32.312000000000005
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackMathematicaRetrieval
type: mteb/cqadupstack-mathematica
config: default
split: test
revision: 90fceea13679c63fe563ded68f3b6f06e50061de
metrics:
- type: map_at_1
value: 10.126
- type: map_at_10
value: 15.916
- type: map_at_100
value: 17.049
- type: map_at_1000
value: 17.19
- type: map_at_20
value: 16.569
- type: map_at_3
value: 13.986
- type: map_at_5
value: 15.052999999999999
- type: mrr_at_1
value: 13.059999999999999
- type: mrr_at_10
value: 19.52
- type: mrr_at_100
value: 20.599999999999998
- type: mrr_at_1000
value: 20.693
- type: mrr_at_20
value: 20.177999999999997
- type: mrr_at_3
value: 17.496000000000002
- type: mrr_at_5
value: 18.541
- type: ndcg_at_1
value: 13.059999999999999
- type: ndcg_at_10
value: 19.987
- type: ndcg_at_100
value: 25.602000000000004
- type: ndcg_at_1000
value: 29.171999999999997
- type: ndcg_at_20
value: 22.31
- type: ndcg_at_3
value: 16.286
- type: ndcg_at_5
value: 17.931
- type: precision_at_1
value: 13.059999999999999
- type: precision_at_10
value: 3.9050000000000002
- type: precision_at_100
value: 0.771
- type: precision_at_1000
value: 0.123
- type: precision_at_20
value: 2.606
- type: precision_at_3
value: 8.167
- type: precision_at_5
value: 6.045
- type: recall_at_1
value: 10.126
- type: recall_at_10
value: 29.137
- type: recall_at_100
value: 53.824000000000005
- type: recall_at_1000
value: 79.373
- type: recall_at_20
value: 37.475
- type: recall_at_3
value: 18.791
- type: recall_at_5
value: 22.993
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackPhysicsRetrieval
type: mteb/cqadupstack-physics
config: default
split: test
revision: 79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4
metrics:
- type: map_at_1
value: 25.281
- type: map_at_10
value: 34.875
- type: map_at_100
value: 36.268
- type: map_at_1000
value: 36.385
- type: map_at_20
value: 35.711999999999996
- type: map_at_3
value: 31.808999999999997
- type: map_at_5
value: 33.550999999999995
- type: mrr_at_1
value: 31.28
- type: mrr_at_10
value: 40.489000000000004
- type: mrr_at_100
value: 41.434
- type: mrr_at_1000
value: 41.491
- type: mrr_at_20
value: 41.088
- type: mrr_at_3
value: 38.033
- type: mrr_at_5
value: 39.621
- type: ndcg_at_1
value: 31.28
- type: ndcg_at_10
value: 40.716
- type: ndcg_at_100
value: 46.45
- type: ndcg_at_1000
value: 48.851
- type: ndcg_at_20
value: 43.216
- type: ndcg_at_3
value: 35.845
- type: ndcg_at_5
value: 38.251000000000005
- type: precision_at_1
value: 31.28
- type: precision_at_10
value: 7.623
- type: precision_at_100
value: 1.214
- type: precision_at_1000
value: 0.159
- type: precision_at_20
value: 4.625
- type: precision_at_3
value: 17.26
- type: precision_at_5
value: 12.435
- type: recall_at_1
value: 25.281
- type: recall_at_10
value: 52.476
- type: recall_at_100
value: 76.535
- type: recall_at_1000
value: 92.658
- type: recall_at_20
value: 61.211000000000006
- type: recall_at_3
value: 38.805
- type: recall_at_5
value: 45.053
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackProgrammersRetrieval
type: mteb/cqadupstack-programmers
config: default
split: test
revision: 6184bc1440d2dbc7612be22b50686b8826d22b32
metrics:
- type: map_at_1
value: 20.092
- type: map_at_10
value: 27.805999999999997
- type: map_at_100
value: 29.137999999999998
- type: map_at_1000
value: 29.266
- type: map_at_20
value: 28.587
- type: map_at_3
value: 25.112000000000002
- type: map_at_5
value: 26.551000000000002
- type: mrr_at_1
value: 24.315
- type: mrr_at_10
value: 32.068000000000005
- type: mrr_at_100
value: 33.039
- type: mrr_at_1000
value: 33.114
- type: mrr_at_20
value: 32.66
- type: mrr_at_3
value: 29.49
- type: mrr_at_5
value: 30.906
- type: ndcg_at_1
value: 24.315
- type: ndcg_at_10
value: 32.9
- type: ndcg_at_100
value: 38.741
- type: ndcg_at_1000
value: 41.657
- type: ndcg_at_20
value: 35.338
- type: ndcg_at_3
value: 28.069
- type: ndcg_at_5
value: 30.169
- type: precision_at_1
value: 24.315
- type: precision_at_10
value: 6.2330000000000005
- type: precision_at_100
value: 1.072
- type: precision_at_1000
value: 0.15
- type: precision_at_20
value: 3.8580000000000005
- type: precision_at_3
value: 13.318
- type: precision_at_5
value: 9.748999999999999
- type: recall_at_1
value: 20.092
- type: recall_at_10
value: 43.832
- type: recall_at_100
value: 68.75099999999999
- type: recall_at_1000
value: 89.25
- type: recall_at_20
value: 52.445
- type: recall_at_3
value: 30.666
- type: recall_at_5
value: 35.873
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackRetrieval
type: mteb/cqadupstack
config: default
split: test
revision: 160c094312a0e1facb97e55eeddb698c0abe3571
metrics:
- type: map_at_1
value: 19.317
- type: map_at_10
value: 26.653
- type: map_at_100
value: 28.011999999999997
- type: map_at_1000
value: 28.231
- type: map_at_20
value: 27.301
- type: map_at_3
value: 23.763
- type: map_at_5
value: 25.391000000000002
- type: mrr_at_1
value: 24.506
- type: mrr_at_10
value: 31.991999999999997
- type: mrr_at_100
value: 32.924
- type: mrr_at_1000
value: 32.993
- type: mrr_at_20
value: 32.521
- type: mrr_at_3
value: 29.48
- type: mrr_at_5
value: 30.982
- type: ndcg_at_1
value: 24.506
- type: ndcg_at_10
value: 32.202999999999996
- type: ndcg_at_100
value: 37.797
- type: ndcg_at_1000
value: 40.859
- type: ndcg_at_20
value: 34.098
- type: ndcg_at_3
value: 27.552
- type: ndcg_at_5
value: 29.781000000000002
- type: precision_at_1
value: 24.506
- type: precision_at_10
value: 6.462
- type: precision_at_100
value: 1.35
- type: precision_at_1000
value: 0.22499999999999998
- type: precision_at_20
value: 4.071000000000001
- type: precision_at_3
value: 13.241
- type: precision_at_5
value: 9.921000000000001
- type: recall_at_1
value: 19.317
- type: recall_at_10
value: 42.296
- type: recall_at_100
value: 68.2
- type: recall_at_1000
value: 88.565
- type: recall_at_20
value: 49.883
- type: recall_at_3
value: 28.608
- type: recall_at_5
value: 34.854
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackStatsRetrieval
type: mteb/cqadupstack-stats
config: default
split: test
revision: 65ac3a16b8e91f9cee4c9828cc7c335575432a2a
metrics:
- type: map_at_1
value: 18.0
- type: map_at_10
value: 24.444
- type: map_at_100
value: 25.205
- type: map_at_1000
value: 25.291000000000004
- type: map_at_20
value: 24.834
- type: map_at_3
value: 22.311
- type: map_at_5
value: 23.442
- type: mrr_at_1
value: 20.552
- type: mrr_at_10
value: 27.028999999999996
- type: mrr_at_100
value: 27.706999999999997
- type: mrr_at_1000
value: 27.775
- type: mrr_at_20
value: 27.366
- type: mrr_at_3
value: 25.051000000000002
- type: mrr_at_5
value: 26.063
- type: ndcg_at_1
value: 20.552
- type: ndcg_at_10
value: 28.519
- type: ndcg_at_100
value: 32.580999999999996
- type: ndcg_at_1000
value: 34.99
- type: ndcg_at_20
value: 29.848000000000003
- type: ndcg_at_3
value: 24.46
- type: ndcg_at_5
value: 26.273000000000003
- type: precision_at_1
value: 20.552
- type: precision_at_10
value: 4.801
- type: precision_at_100
value: 0.729
- type: precision_at_1000
value: 0.101
- type: precision_at_20
value: 2.715
- type: precision_at_3
value: 10.940999999999999
- type: precision_at_5
value: 7.761
- type: recall_at_1
value: 18.0
- type: recall_at_10
value: 38.425
- type: recall_at_100
value: 57.885
- type: recall_at_1000
value: 75.945
- type: recall_at_20
value: 43.472
- type: recall_at_3
value: 27.483
- type: recall_at_5
value: 31.866
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackTexRetrieval
type: mteb/cqadupstack-tex
config: default
split: test
revision: 46989137a86843e03a6195de44b09deda022eec7
metrics:
- type: map_at_1
value: 10.014000000000001
- type: map_at_10
value: 14.462
- type: map_at_100
value: 15.364
- type: map_at_1000
value: 15.482999999999999
- type: map_at_20
value: 14.931
- type: map_at_3
value: 12.842
- type: map_at_5
value: 13.697999999999999
- type: mrr_at_1
value: 12.526000000000002
- type: mrr_at_10
value: 17.433
- type: mrr_at_100
value: 18.296
- type: mrr_at_1000
value: 18.383
- type: mrr_at_20
value: 17.897
- type: mrr_at_3
value: 15.703
- type: mrr_at_5
value: 16.627
- type: ndcg_at_1
value: 12.526000000000002
- type: ndcg_at_10
value: 17.697
- type: ndcg_at_100
value: 22.33
- type: ndcg_at_1000
value: 25.587
- type: ndcg_at_20
value: 19.302
- type: ndcg_at_3
value: 14.606
- type: ndcg_at_5
value: 15.946
- type: precision_at_1
value: 12.526000000000002
- type: precision_at_10
value: 3.383
- type: precision_at_100
value: 0.6799999999999999
- type: precision_at_1000
value: 0.11199999999999999
- type: precision_at_20
value: 2.147
- type: precision_at_3
value: 7.02
- type: precision_at_5
value: 5.196
- type: recall_at_1
value: 10.014000000000001
- type: recall_at_10
value: 24.623
- type: recall_at_100
value: 45.795
- type: recall_at_1000
value: 69.904
- type: recall_at_20
value: 30.534
- type: recall_at_3
value: 15.955
- type: recall_at_5
value: 19.394
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackUnixRetrieval
type: mteb/cqadupstack-unix
config: default
split: test
revision: 6c6430d3a6d36f8d2a829195bc5dc94d7e063e53
metrics:
- type: map_at_1
value: 19.156000000000002
- type: map_at_10
value: 26.144000000000002
- type: map_at_100
value: 27.157999999999998
- type: map_at_1000
value: 27.288
- type: map_at_20
value: 26.689
- type: map_at_3
value: 24.125
- type: map_at_5
value: 25.369000000000003
- type: mrr_at_1
value: 22.854
- type: mrr_at_10
value: 29.874000000000002
- type: mrr_at_100
value: 30.738
- type: mrr_at_1000
value: 30.826999999999998
- type: mrr_at_20
value: 30.354
- type: mrr_at_3
value: 27.689999999999998
- type: mrr_at_5
value: 29.131
- type: ndcg_at_1
value: 22.854
- type: ndcg_at_10
value: 30.469
- type: ndcg_at_100
value: 35.475
- type: ndcg_at_1000
value: 38.59
- type: ndcg_at_20
value: 32.333
- type: ndcg_at_3
value: 26.674999999999997
- type: ndcg_at_5
value: 28.707
- type: precision_at_1
value: 22.854
- type: precision_at_10
value: 5.1209999999999996
- type: precision_at_100
value: 0.8500000000000001
- type: precision_at_1000
value: 0.123
- type: precision_at_20
value: 3.0460000000000003
- type: precision_at_3
value: 12.127
- type: precision_at_5
value: 8.75
- type: recall_at_1
value: 19.156000000000002
- type: recall_at_10
value: 40.009
- type: recall_at_100
value: 62.419999999999995
- type: recall_at_1000
value: 84.585
- type: recall_at_20
value: 46.912
- type: recall_at_3
value: 29.733999999999998
- type: recall_at_5
value: 34.741
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackWebmastersRetrieval
type: mteb/cqadupstack-webmasters
config: default
split: test
revision: 160c094312a0e1facb97e55eeddb698c0abe3571
metrics:
- type: map_at_1
value: 19.317
- type: map_at_10
value: 26.653
- type: map_at_100
value: 28.011999999999997
- type: map_at_1000
value: 28.231
- type: map_at_20
value: 27.301
- type: map_at_3
value: 23.763
- type: map_at_5
value: 25.391000000000002
- type: mrr_at_1
value: 24.506
- type: mrr_at_10
value: 31.991999999999997
- type: mrr_at_100
value: 32.924
- type: mrr_at_1000
value: 32.993
- type: mrr_at_20
value: 32.521
- type: mrr_at_3
value: 29.48
- type: mrr_at_5
value: 30.982
- type: ndcg_at_1
value: 24.506
- type: ndcg_at_10
value: 32.202999999999996
- type: ndcg_at_100
value: 37.797
- type: ndcg_at_1000
value: 40.859
- type: ndcg_at_20
value: 34.098
- type: ndcg_at_3
value: 27.552
- type: ndcg_at_5
value: 29.781000000000002
- type: precision_at_1
value: 24.506
- type: precision_at_10
value: 6.462
- type: precision_at_100
value: 1.35
- type: precision_at_1000
value: 0.22499999999999998
- type: precision_at_20
value: 4.071000000000001
- type: precision_at_3
value: 13.241
- type: precision_at_5
value: 9.921000000000001
- type: recall_at_1
value: 19.317
- type: recall_at_10
value: 42.296
- type: recall_at_100
value: 68.2
- type: recall_at_1000
value: 88.565
- type: recall_at_20
value: 49.883
- type: recall_at_3
value: 28.608
- type: recall_at_5
value: 34.854
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackWordpressRetrieval
type: mteb/cqadupstack-wordpress
config: default
split: test
revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4
metrics:
- type: map_at_1
value: 12.822
- type: map_at_10
value: 18.055
- type: map_at_100
value: 18.942
- type: map_at_1000
value: 19.057
- type: map_at_20
value: 18.544
- type: map_at_3
value: 15.964
- type: map_at_5
value: 16.833000000000002
- type: mrr_at_1
value: 14.048
- type: mrr_at_10
value: 19.489
- type: mrr_at_100
value: 20.392
- type: mrr_at_1000
value: 20.49
- type: mrr_at_20
value: 19.979
- type: mrr_at_3
value: 17.344
- type: mrr_at_5
value: 18.287
- type: ndcg_at_1
value: 14.048
- type: ndcg_at_10
value: 21.737000000000002
- type: ndcg_at_100
value: 26.383000000000003
- type: ndcg_at_1000
value: 29.555
- type: ndcg_at_20
value: 23.463
- type: ndcg_at_3
value: 17.29
- type: ndcg_at_5
value: 18.829
- type: precision_at_1
value: 14.048
- type: precision_at_10
value: 3.6229999999999998
- type: precision_at_100
value: 0.641
- type: precision_at_1000
value: 0.099
- type: precision_at_20
value: 2.1999999999999997
- type: precision_at_3
value: 7.2090000000000005
- type: precision_at_5
value: 5.213
- type: recall_at_1
value: 12.822
- type: recall_at_10
value: 32.123000000000005
- type: recall_at_100
value: 53.657999999999994
- type: recall_at_1000
value: 77.72200000000001
- type: recall_at_20
value: 38.66
- type: recall_at_3
value: 19.814999999999998
- type: recall_at_5
value: 23.432
- task:
type: Retrieval
dataset:
name: MTEB ClimateFEVER
type: mteb/climate-fever
config: default
split: test
revision: 47f2ac6acb640fc46020b02a5b59fdda04d39380
metrics:
- type: map_at_1
value: 13.119
- type: map_at_10
value: 22.999
- type: map_at_100
value: 25.108000000000004
- type: map_at_1000
value: 25.306
- type: map_at_20
value: 24.141000000000002
- type: map_at_3
value: 19.223000000000003
- type: map_at_5
value: 21.181
- type: mrr_at_1
value: 30.554
- type: mrr_at_10
value: 42.553000000000004
- type: mrr_at_100
value: 43.498
- type: mrr_at_1000
value: 43.527
- type: mrr_at_20
value: 43.193
- type: mrr_at_3
value: 39.283
- type: mrr_at_5
value: 41.143
- type: ndcg_at_1
value: 30.554
- type: ndcg_at_10
value: 31.946
- type: ndcg_at_100
value: 39.934999999999995
- type: ndcg_at_1000
value: 43.256
- type: ndcg_at_20
value: 35.101
- type: ndcg_at_3
value: 26.489
- type: ndcg_at_5
value: 28.272000000000002
- type: precision_at_1
value: 30.554
- type: precision_at_10
value: 10.039
- type: precision_at_100
value: 1.864
- type: precision_at_1000
value: 0.248
- type: precision_at_20
value: 6.371
- type: precision_at_3
value: 20.174
- type: precision_at_5
value: 15.296000000000001
- type: recall_at_1
value: 13.119
- type: recall_at_10
value: 37.822
- type: recall_at_100
value: 65.312
- type: recall_at_1000
value: 83.817
- type: recall_at_20
value: 46.760000000000005
- type: recall_at_3
value: 23.858999999999998
- type: recall_at_5
value: 29.609999999999996
- task:
type: Retrieval
dataset:
name: MTEB DBPedia
type: mteb/dbpedia
config: default
split: test
revision: c0f706b76e590d620bd6618b3ca8efdd34e2d659
metrics:
- type: map_at_1
value: 8.176
- type: map_at_10
value: 19.594
- type: map_at_100
value: 28.081
- type: map_at_1000
value: 29.864
- type: map_at_20
value: 22.983999999999998
- type: map_at_3
value: 13.923
- type: map_at_5
value: 16.597
- type: mrr_at_1
value: 66.75
- type: mrr_at_10
value: 75.82600000000001
- type: mrr_at_100
value: 76.145
- type: mrr_at_1000
value: 76.14999999999999
- type: mrr_at_20
value: 76.074
- type: mrr_at_3
value: 74.333
- type: mrr_at_5
value: 75.25800000000001
- type: ndcg_at_1
value: 54.50000000000001
- type: ndcg_at_10
value: 41.806
- type: ndcg_at_100
value: 47.067
- type: ndcg_at_1000
value: 54.397
- type: ndcg_at_20
value: 41.727
- type: ndcg_at_3
value: 46.92
- type: ndcg_at_5
value: 44.381
- type: precision_at_1
value: 66.75
- type: precision_at_10
value: 33.35
- type: precision_at_100
value: 10.92
- type: precision_at_1000
value: 2.222
- type: precision_at_20
value: 25.862000000000002
- type: precision_at_3
value: 51.417
- type: precision_at_5
value: 43.65
- type: recall_at_1
value: 8.176
- type: recall_at_10
value: 26.029000000000003
- type: recall_at_100
value: 53.872
- type: recall_at_1000
value: 76.895
- type: recall_at_20
value: 34.192
- type: recall_at_3
value: 15.789
- type: recall_at_5
value: 20.255000000000003
- task:
type: Classification
dataset:
name: MTEB EmotionClassification
type: mteb/emotion
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 48.22
- type: f1
value: 43.59074485488622
- task:
type: Retrieval
dataset:
name: MTEB FEVER
type: mteb/fever
config: default
split: test
revision: bea83ef9e8fb933d90a2f1d5515737465d613e12
metrics:
- type: map_at_1
value: 40.872
- type: map_at_10
value: 55.178000000000004
- type: map_at_100
value: 55.859
- type: map_at_1000
value: 55.881
- type: map_at_20
value: 55.66
- type: map_at_3
value: 51.4
- type: map_at_5
value: 53.754000000000005
- type: mrr_at_1
value: 43.744
- type: mrr_at_10
value: 58.36900000000001
- type: mrr_at_100
value: 58.911
- type: mrr_at_1000
value: 58.916999999999994
- type: mrr_at_20
value: 58.779
- type: mrr_at_3
value: 54.653
- type: mrr_at_5
value: 56.987
- type: ndcg_at_1
value: 43.744
- type: ndcg_at_10
value: 62.936
- type: ndcg_at_100
value: 65.666
- type: ndcg_at_1000
value: 66.08699999999999
- type: ndcg_at_20
value: 64.548
- type: ndcg_at_3
value: 55.543
- type: ndcg_at_5
value: 59.646
- type: precision_at_1
value: 43.744
- type: precision_at_10
value: 9.191
- type: precision_at_100
value: 1.072
- type: precision_at_1000
value: 0.11299999999999999
- type: precision_at_20
value: 4.967
- type: precision_at_3
value: 23.157
- type: precision_at_5
value: 16.115
- type: recall_at_1
value: 40.872
- type: recall_at_10
value: 83.818
- type: recall_at_100
value: 95.14200000000001
- type: recall_at_1000
value: 97.897
- type: recall_at_20
value: 89.864
- type: recall_at_3
value: 64.19200000000001
- type: recall_at_5
value: 74.029
- task:
type: Retrieval
dataset:
name: MTEB FiQA2018
type: mteb/fiqa
config: default
split: test
revision: 27a168819829fe9bcd655c2df245fb19452e8e06
metrics:
- type: map_at_1
value: 14.804999999999998
- type: map_at_10
value: 22.86
- type: map_at_100
value: 24.823999999999998
- type: map_at_1000
value: 25.041000000000004
- type: map_at_20
value: 23.881
- type: map_at_3
value: 20.09
- type: map_at_5
value: 21.39
- type: mrr_at_1
value: 29.938
- type: mrr_at_10
value: 37.041000000000004
- type: mrr_at_100
value: 38.196000000000005
- type: mrr_at_1000
value: 38.256
- type: mrr_at_20
value: 37.693
- type: mrr_at_3
value: 34.721999999999994
- type: mrr_at_5
value: 35.787
- type: ndcg_at_1
value: 29.938
- type: ndcg_at_10
value: 29.358
- type: ndcg_at_100
value: 37.544
- type: ndcg_at_1000
value: 41.499
- type: ndcg_at_20
value: 32.354
- type: ndcg_at_3
value: 26.434
- type: ndcg_at_5
value: 26.93
- type: precision_at_1
value: 29.938
- type: precision_at_10
value: 8.117
- type: precision_at_100
value: 1.611
- type: precision_at_1000
value: 0.232
- type: precision_at_20
value: 5.255
- type: precision_at_3
value: 17.49
- type: precision_at_5
value: 12.747
- type: recall_at_1
value: 14.804999999999998
- type: recall_at_10
value: 34.776
- type: recall_at_100
value: 66.279
- type: recall_at_1000
value: 89.96600000000001
- type: recall_at_20
value: 44.31
- type: recall_at_3
value: 23.623
- type: recall_at_5
value: 27.194000000000003
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA
type: mteb/hotpotqa
config: default
split: test
revision: ab518f4d6fcca38d87c25209f94beba119d02014
metrics:
- type: map_at_1
value: 38.555
- type: map_at_10
value: 54.20700000000001
- type: map_at_100
value: 55.177
- type: map_at_1000
value: 55.254999999999995
- type: map_at_20
value: 54.788000000000004
- type: map_at_3
value: 51.034
- type: map_at_5
value: 52.998
- type: mrr_at_1
value: 77.11
- type: mrr_at_10
value: 82.93199999999999
- type: mrr_at_100
value: 83.14200000000001
- type: mrr_at_1000
value: 83.15
- type: mrr_at_20
value: 83.062
- type: mrr_at_3
value: 81.95599999999999
- type: mrr_at_5
value: 82.586
- type: ndcg_at_1
value: 77.11
- type: ndcg_at_10
value: 63.853
- type: ndcg_at_100
value: 67.18499999999999
- type: ndcg_at_1000
value: 68.676
- type: ndcg_at_20
value: 65.279
- type: ndcg_at_3
value: 59.301
- type: ndcg_at_5
value: 61.822
- type: precision_at_1
value: 77.11
- type: precision_at_10
value: 13.044
- type: precision_at_100
value: 1.5630000000000002
- type: precision_at_1000
value: 0.17600000000000002
- type: precision_at_20
value: 6.979
- type: precision_at_3
value: 36.759
- type: precision_at_5
value: 24.054000000000002
- type: recall_at_1
value: 38.555
- type: recall_at_10
value: 65.21900000000001
- type: recall_at_100
value: 78.16300000000001
- type: recall_at_1000
value: 88.02799999999999
- type: recall_at_20
value: 69.791
- type: recall_at_3
value: 55.138
- type: recall_at_5
value: 60.135000000000005
- task:
type: Classification
dataset:
name: MTEB ImdbClassification
type: mteb/imdb
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 69.8728
- type: ap
value: 63.98214492125858
- type: f1
value: 69.59975497754624
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification
type: mteb/mtop_domain
config: default
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 94.76288189694483
- type: f1
value: 94.52150972672682
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification
type: mteb/mtop_intent
config: default
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 76.83994528043777
- type: f1
value: 57.95571154189732
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification
type: mteb/amazon_massive_intent
config: default
split: test
revision: 4672e20407010da34463acc759c162ca9734bca6
metrics:
- type: accuracy
value: 46.1163416274378
- type: f1
value: 45.425692244093064
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification
type: mteb/amazon_massive_scenario
config: default
split: test
revision: fad2c6e8459f9e1c45d9315f4953d921437d70f8
metrics:
- type: accuracy
value: 45.57834566240753
- type: f1
value: 43.84840097785479
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringP2P
type: mteb/medrxiv-clustering-p2p
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 32.86396397182615
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringS2S
type: mteb/medrxiv-clustering-s2s
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 34.018965727588565
- task:
type: Reranking
dataset:
name: MTEB MindSmallReranking
type: mteb/mind_small
config: default
split: test
revision: 59042f120c80e8afa9cdbb224f67076cec0fc9a7
metrics:
- type: map
value: 31.286618059824573
- type: mrr
value: 32.481830769278965
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus
type: mteb/nfcorpus
config: default
split: test
revision: ec0fa4fe99da2ff19ca1214b7966684033a58814
metrics:
- type: map_at_1
value: 4.236
- type: map_at_10
value: 9.352
- type: map_at_100
value: 12.382
- type: map_at_1000
value: 13.828999999999999
- type: map_at_20
value: 10.619
- type: map_at_3
value: 6.814000000000001
- type: map_at_5
value: 7.887
- type: mrr_at_1
value: 37.152
- type: mrr_at_10
value: 47.055
- type: mrr_at_100
value: 47.82
- type: mrr_at_1000
value: 47.86
- type: mrr_at_20
value: 47.605
- type: mrr_at_3
value: 44.118
- type: mrr_at_5
value: 46.115
- type: ndcg_at_1
value: 34.365
- type: ndcg_at_10
value: 28.473
- type: ndcg_at_100
value: 27.311999999999998
- type: ndcg_at_1000
value: 36.671
- type: ndcg_at_20
value: 27.137
- type: ndcg_at_3
value: 31.939
- type: ndcg_at_5
value: 30.428
- type: precision_at_1
value: 36.223
- type: precision_at_10
value: 21.858
- type: precision_at_100
value: 7.417999999999999
- type: precision_at_1000
value: 2.0709999999999997
- type: precision_at_20
value: 16.502
- type: precision_at_3
value: 30.857
- type: precision_at_5
value: 26.997
- type: recall_at_1
value: 4.236
- type: recall_at_10
value: 13.489
- type: recall_at_100
value: 29.580000000000002
- type: recall_at_1000
value: 62.726000000000006
- type: recall_at_20
value: 18.346999999999998
- type: recall_at_3
value: 7.811
- type: recall_at_5
value: 10.086
- task:
type: Retrieval
dataset:
name: MTEB NQ
type: mteb/nq
config: default
split: test
revision: b774495ed302d8c44a3a7ea25c90dbce03968f31
metrics:
- type: map_at_1
value: 21.123
- type: map_at_10
value: 34.429
- type: map_at_100
value: 35.803000000000004
- type: map_at_1000
value: 35.853
- type: map_at_20
value: 35.308
- type: map_at_3
value: 30.095
- type: map_at_5
value: 32.435
- type: mrr_at_1
value: 23.841
- type: mrr_at_10
value: 36.864999999999995
- type: mrr_at_100
value: 37.935
- type: mrr_at_1000
value: 37.97
- type: mrr_at_20
value: 37.566
- type: mrr_at_3
value: 32.918
- type: mrr_at_5
value: 35.11
- type: ndcg_at_1
value: 23.841
- type: ndcg_at_10
value: 42.043
- type: ndcg_at_100
value: 48.015
- type: ndcg_at_1000
value: 49.152
- type: ndcg_at_20
value: 44.936
- type: ndcg_at_3
value: 33.513999999999996
- type: ndcg_at_5
value: 37.541999999999994
- type: precision_at_1
value: 23.841
- type: precision_at_10
value: 7.454
- type: precision_at_100
value: 1.081
- type: precision_at_1000
value: 0.11900000000000001
- type: precision_at_20
value: 4.413
- type: precision_at_3
value: 15.672
- type: precision_at_5
value: 11.657
- type: recall_at_1
value: 21.123
- type: recall_at_10
value: 63.096
- type: recall_at_100
value: 89.27199999999999
- type: recall_at_1000
value: 97.69
- type: recall_at_20
value: 73.873
- type: recall_at_3
value: 40.588
- type: recall_at_5
value: 49.928
- task:
type: Retrieval
dataset:
name: MTEB QuoraRetrieval
type: mteb/quora
config: default
split: test
revision: e4e08e0b7dbe3c8700f0daef558ff32256715259
metrics:
- type: map_at_1
value: 70.255
- type: map_at_10
value: 84.387
- type: map_at_100
value: 85.027
- type: map_at_1000
value: 85.043
- type: map_at_20
value: 84.809
- type: map_at_3
value: 81.5
- type: map_at_5
value: 83.286
- type: mrr_at_1
value: 80.85
- type: mrr_at_10
value: 87.25699999999999
- type: mrr_at_100
value: 87.363
- type: mrr_at_1000
value: 87.363
- type: mrr_at_20
value: 87.336
- type: mrr_at_3
value: 86.357
- type: mrr_at_5
value: 86.939
- type: ndcg_at_1
value: 80.86
- type: ndcg_at_10
value: 88.151
- type: ndcg_at_100
value: 89.381
- type: ndcg_at_1000
value: 89.47800000000001
- type: ndcg_at_20
value: 88.82100000000001
- type: ndcg_at_3
value: 85.394
- type: ndcg_at_5
value: 86.855
- type: precision_at_1
value: 80.86
- type: precision_at_10
value: 13.397
- type: precision_at_100
value: 1.5310000000000001
- type: precision_at_1000
value: 0.157
- type: precision_at_20
value: 7.106999999999999
- type: precision_at_3
value: 37.46
- type: precision_at_5
value: 24.568
- type: recall_at_1
value: 70.255
- type: recall_at_10
value: 95.405
- type: recall_at_100
value: 99.56
- type: recall_at_1000
value: 99.98599999999999
- type: recall_at_20
value: 97.544
- type: recall_at_3
value: 87.414
- type: recall_at_5
value: 91.598
- task:
type: Clustering
dataset:
name: MTEB RedditClustering
type: mteb/reddit-clustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 54.7557403999403
- task:
type: Clustering
dataset:
name: MTEB RedditClusteringP2P
type: mteb/reddit-clustering-p2p
config: default
split: test
revision: 385e3cb46b4cfa89021f56c4380204149d0efe33
metrics:
- type: v_measure
value: 56.2773308957202
- task:
type: Retrieval
dataset:
name: MTEB SCIDOCS
type: mteb/scidocs
config: default
split: test
revision: f8c2fcf00f625baaa80f62ec5bd9e1fff3b8ae88
metrics:
- type: map_at_1
value: 4.123
- type: map_at_10
value: 9.940999999999999
- type: map_at_100
value: 11.928999999999998
- type: map_at_1000
value: 12.257
- type: map_at_20
value: 10.866000000000001
- type: map_at_3
value: 7.091
- type: map_at_5
value: 8.393
- type: mrr_at_1
value: 20.3
- type: mrr_at_10
value: 30.068
- type: mrr_at_100
value: 31.296000000000003
- type: mrr_at_1000
value: 31.36
- type: mrr_at_20
value: 30.756
- type: mrr_at_3
value: 26.667
- type: mrr_at_5
value: 28.616999999999997
- type: ndcg_at_1
value: 20.3
- type: ndcg_at_10
value: 17.305
- type: ndcg_at_100
value: 25.529000000000003
- type: ndcg_at_1000
value: 31.41
- type: ndcg_at_20
value: 19.967
- type: ndcg_at_3
value: 16.022
- type: ndcg_at_5
value: 14.12
- type: precision_at_1
value: 20.3
- type: precision_at_10
value: 9.06
- type: precision_at_100
value: 2.103
- type: precision_at_1000
value: 0.35200000000000004
- type: precision_at_20
value: 6.075
- type: precision_at_3
value: 14.832999999999998
- type: precision_at_5
value: 12.36
- type: recall_at_1
value: 4.123
- type: recall_at_10
value: 18.383
- type: recall_at_100
value: 42.67
- type: recall_at_1000
value: 71.44800000000001
- type: recall_at_20
value: 24.64
- type: recall_at_3
value: 9.043
- type: recall_at_5
value: 12.543000000000001
- task:
type: STS
dataset:
name: MTEB SICK-R
type: mteb/sickr-sts
config: default
split: test
revision: 20a6d6f312dd54037fe07a32d58e5e168867909d
metrics:
- type: cos_sim_pearson
value: 84.37101718384514
- type: cos_sim_spearman
value: 80.73657031880697
- type: euclidean_pearson
value: 81.42351850520845
- type: euclidean_spearman
value: 80.81452496851979
- type: manhattan_pearson
value: 81.47676252115669
- type: manhattan_spearman
value: 80.87566944708885
- task:
type: STS
dataset:
name: MTEB STS12
type: mteb/sts12-sts
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_pearson
value: 84.79559176971591
- type: cos_sim_spearman
value: 75.41866597445552
- type: euclidean_pearson
value: 83.20287101163838
- type: euclidean_spearman
value: 75.54564777571143
- type: manhattan_pearson
value: 83.24622548900163
- type: manhattan_spearman
value: 75.63826258190343
- task:
type: STS
dataset:
name: MTEB STS13
type: mteb/sts13-sts
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_pearson
value: 84.63322096299294
- type: cos_sim_spearman
value: 85.48272638914783
- type: euclidean_pearson
value: 85.57327707819331
- type: euclidean_spearman
value: 85.90735298172922
- type: manhattan_pearson
value: 85.5744191274933
- type: manhattan_spearman
value: 85.90828008488766
- task:
type: STS
dataset:
name: MTEB STS14
type: mteb/sts14-sts
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_pearson
value: 82.05530140566407
- type: cos_sim_spearman
value: 78.85454907951474
- type: euclidean_pearson
value: 81.4307311680376
- type: euclidean_spearman
value: 78.99131623529348
- type: manhattan_pearson
value: 81.46870892683134
- type: manhattan_spearman
value: 79.05473823658481
- task:
type: STS
dataset:
name: MTEB STS15
type: mteb/sts15-sts
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_pearson
value: 83.66620817683379
- type: cos_sim_spearman
value: 85.23347998035328
- type: euclidean_pearson
value: 84.59001637865366
- type: euclidean_spearman
value: 85.0081410316597
- type: manhattan_pearson
value: 84.59742325369818
- type: manhattan_spearman
value: 85.01721329704324
- task:
type: STS
dataset:
name: MTEB STS16
type: mteb/sts16-sts
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_pearson
value: 79.86344730144208
- type: cos_sim_spearman
value: 82.15966778685441
- type: euclidean_pearson
value: 81.85580574642779
- type: euclidean_spearman
value: 82.06482873417123
- type: manhattan_pearson
value: 81.82971046102377
- type: manhattan_spearman
value: 82.04185436355144
- task:
type: STS
dataset:
name: MTEB STS17
type: mteb/sts17-crosslingual-sts
config: default
split: test
revision: faeb762787bd10488a50c8b5be4a3b82e411949c
metrics:
- type: cos_sim_pearson
value: 31.440481026661672
- type: cos_sim_spearman
value: 31.592743544965913
- type: euclidean_pearson
value: 31.15111049327518
- type: euclidean_spearman
value: 30.555124184361464
- type: manhattan_pearson
value: 31.724139249295654
- type: manhattan_spearman
value: 30.483389245793504
- task:
type: STS
dataset:
name: MTEB STS22
type: mteb/sts22-crosslingual-sts
config: default
split: test
revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3
metrics:
- type: cos_sim_pearson
value: 34.51489724275415
- type: cos_sim_spearman
value: 47.06532141601629
- type: euclidean_pearson
value: 33.28904737503036
- type: euclidean_spearman
value: 45.111172981641865
- type: manhattan_pearson
value: 33.36374172942392
- type: manhattan_spearman
value: 45.100940945158534
- task:
type: STS
dataset:
name: MTEB STSBenchmark
type: mteb/stsbenchmark-sts
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_pearson
value: 82.09996292950329
- type: cos_sim_spearman
value: 82.69376206796092
- type: euclidean_pearson
value: 82.83254956369134
- type: euclidean_spearman
value: 82.34202999843637
- type: manhattan_pearson
value: 82.8048494319632
- type: manhattan_spearman
value: 82.34713123336984
- task:
type: Reranking
dataset:
name: MTEB SciDocsRR
type: mteb/scidocs-reranking
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 82.1402269601644
- type: mrr
value: 94.84447197682492
- task:
type: Retrieval
dataset:
name: MTEB SciFact
type: mteb/scifact
config: default
split: test
revision: 0228b52cf27578f30900b9e5271d331663a030d7
metrics:
- type: map_at_1
value: 49.138999999999996
- type: map_at_10
value: 60.288
- type: map_at_100
value: 61.082
- type: map_at_1000
value: 61.11
- type: map_at_20
value: 60.831999999999994
- type: map_at_3
value: 57.106
- type: map_at_5
value: 58.857000000000006
- type: mrr_at_1
value: 51.333
- type: mrr_at_10
value: 61.364
- type: mrr_at_100
value: 62.029999999999994
- type: mrr_at_1000
value: 62.056
- type: mrr_at_20
value: 61.85000000000001
- type: mrr_at_3
value: 58.721999999999994
- type: mrr_at_5
value: 60.221999999999994
- type: ndcg_at_1
value: 51.333
- type: ndcg_at_10
value: 65.71900000000001
- type: ndcg_at_100
value: 69.036
- type: ndcg_at_1000
value: 69.626
- type: ndcg_at_20
value: 67.571
- type: ndcg_at_3
value: 60.019
- type: ndcg_at_5
value: 62.733000000000004
- type: precision_at_1
value: 51.333
- type: precision_at_10
value: 9.067
- type: precision_at_100
value: 1.083
- type: precision_at_1000
value: 0.11299999999999999
- type: precision_at_20
value: 4.95
- type: precision_at_3
value: 23.889
- type: precision_at_5
value: 16.0
- type: recall_at_1
value: 49.138999999999996
- type: recall_at_10
value: 81.256
- type: recall_at_100
value: 95.6
- type: recall_at_1000
value: 100.0
- type: recall_at_20
value: 88.289
- type: recall_at_3
value: 66.078
- type: recall_at_5
value: 72.661
- task:
type: PairClassification
dataset:
name: MTEB SprintDuplicateQuestions
type: mteb/sprintduplicatequestions-pairclassification
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.73762376237623
- type: cos_sim_ap
value: 93.02149432690442
- type: cos_sim_f1
value: 86.59079663532904
- type: cos_sim_precision
value: 85.70029382957884
- type: cos_sim_recall
value: 87.5
- type: dot_accuracy
value: 99.73267326732673
- type: dot_ap
value: 92.38661051842968
- type: dot_f1
value: 85.92283628779978
- type: dot_precision
value: 89.76034858387798
- type: dot_recall
value: 82.39999999999999
- type: euclidean_accuracy
value: 99.73960396039604
- type: euclidean_ap
value: 92.99557708360517
- type: euclidean_f1
value: 86.49183572488866
- type: euclidean_precision
value: 85.60235063663075
- type: euclidean_recall
value: 87.4
- type: manhattan_accuracy
value: 99.74059405940594
- type: manhattan_ap
value: 93.24237279644005
- type: manhattan_f1
value: 86.77727501256913
- type: manhattan_precision
value: 87.25985844287159
- type: manhattan_recall
value: 86.3
- type: max_accuracy
value: 99.74059405940594
- type: max_ap
value: 93.24237279644005
- type: max_f1
value: 86.77727501256913
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClustering
type: mteb/stackexchange-clustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 63.94924261127149
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClusteringP2P
type: mteb/stackexchange-clustering-p2p
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 32.22297034902405
- task:
type: Reranking
dataset:
name: MTEB StackOverflowDupQuestions
type: mteb/stackoverflowdupquestions-reranking
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 46.12948438780115
- type: mrr
value: 46.77186783804431
- task:
type: Summarization
dataset:
name: MTEB SummEval
type: mteb/summeval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 30.02235612863601
- type: cos_sim_spearman
value: 30.567504287706598
- type: dot_pearson
value: 28.943978981614897
- type: dot_spearman
value: 29.905635915797358
- task:
type: Retrieval
dataset:
name: MTEB TRECCOVID
type: mteb/trec-covid
config: default
split: test
revision: bb9466bac8153a0349341eb1b22e06409e78ef4e
metrics:
- type: map_at_1
value: 0.173
- type: map_at_10
value: 1.124
- type: map_at_100
value: 5.645
- type: map_at_1000
value: 14.965
- type: map_at_20
value: 1.876
- type: map_at_3
value: 0.45599999999999996
- type: map_at_5
value: 0.699
- type: mrr_at_1
value: 70.0
- type: mrr_at_10
value: 81.786
- type: mrr_at_100
value: 81.786
- type: mrr_at_1000
value: 81.786
- type: mrr_at_20
value: 81.786
- type: mrr_at_3
value: 80.0
- type: mrr_at_5
value: 81.5
- type: ndcg_at_1
value: 65.0
- type: ndcg_at_10
value: 53.88699999999999
- type: ndcg_at_100
value: 38.028
- type: ndcg_at_1000
value: 37.183
- type: ndcg_at_20
value: 49.286
- type: ndcg_at_3
value: 63.05
- type: ndcg_at_5
value: 59.49100000000001
- type: precision_at_1
value: 70.0
- type: precision_at_10
value: 55.400000000000006
- type: precision_at_100
value: 38.800000000000004
- type: precision_at_1000
value: 17.082
- type: precision_at_20
value: 50.7
- type: precision_at_3
value: 66.667
- type: precision_at_5
value: 62.4
- type: recall_at_1
value: 0.173
- type: recall_at_10
value: 1.353
- type: recall_at_100
value: 8.887
- type: recall_at_1000
value: 36.012
- type: recall_at_20
value: 2.476
- type: recall_at_3
value: 0.508
- type: recall_at_5
value: 0.795
- task:
type: Retrieval
dataset:
name: MTEB Touche2020
type: mteb/touche2020
config: default
split: test
revision: a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f
metrics:
- type: map_at_1
value: 2.614
- type: map_at_10
value: 6.651999999999999
- type: map_at_100
value: 11.59
- type: map_at_1000
value: 13.044
- type: map_at_20
value: 8.702
- type: map_at_3
value: 4.159
- type: map_at_5
value: 5.327
- type: mrr_at_1
value: 30.612000000000002
- type: mrr_at_10
value: 42.664
- type: mrr_at_100
value: 43.957
- type: mrr_at_1000
value: 43.957
- type: mrr_at_20
value: 43.193
- type: mrr_at_3
value: 40.476
- type: mrr_at_5
value: 42.007
- type: ndcg_at_1
value: 27.551
- type: ndcg_at_10
value: 18.098
- type: ndcg_at_100
value: 30.019000000000002
- type: ndcg_at_1000
value: 42.179
- type: ndcg_at_20
value: 19.552
- type: ndcg_at_3
value: 21.22
- type: ndcg_at_5
value: 19.774
- type: precision_at_1
value: 30.612000000000002
- type: precision_at_10
value: 15.101999999999999
- type: precision_at_100
value: 6.510000000000001
- type: precision_at_1000
value: 1.4569999999999999
- type: precision_at_20
value: 12.449
- type: precision_at_3
value: 22.448999999999998
- type: precision_at_5
value: 19.592000000000002
- type: recall_at_1
value: 2.614
- type: recall_at_10
value: 11.068
- type: recall_at_100
value: 42.317
- type: recall_at_1000
value: 79.063
- type: recall_at_20
value: 18.589
- type: recall_at_3
value: 5.06
- type: recall_at_5
value: 7.356
- task:
type: Classification
dataset:
name: MTEB ToxicConversationsClassification
type: mteb/toxic_conversations_50k
config: default
split: test
revision: edfaf9da55d3dd50d43143d90c1ac476895ae6de
metrics:
- type: accuracy
value: 75.0146484375
- type: ap
value: 16.80191476928431
- type: f1
value: 58.08037205204817
- task:
type: Classification
dataset:
name: MTEB TweetSentimentExtractionClassification
type: mteb/tweet_sentiment_extraction
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 61.80249009620826
- type: f1
value: 62.24155926661914
- task:
type: Clustering
dataset:
name: MTEB TwentyNewsgroupsClustering
type: mteb/twentynewsgroups-clustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 47.074846780747094
- task:
type: PairClassification
dataset:
name: MTEB TwitterSemEval2015
type: mteb/twittersemeval2015-pairclassification
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 85.21785778148656
- type: cos_sim_ap
value: 71.06584074764645
- type: cos_sim_f1
value: 65.81720166625826
- type: cos_sim_precision
value: 61.43641354071363
- type: cos_sim_recall
value: 70.87071240105541
- type: dot_accuracy
value: 84.30589497526375
- type: dot_ap
value: 68.85872202019365
- type: dot_f1
value: 64.20295157946092
- type: dot_precision
value: 59.69607620775687
- type: dot_recall
value: 69.44591029023746
- type: euclidean_accuracy
value: 85.21189724026942
- type: euclidean_ap
value: 71.18847194129523
- type: euclidean_f1
value: 66.00049962528105
- type: euclidean_precision
value: 62.66603415559773
- type: euclidean_recall
value: 69.70976253298153
- type: manhattan_accuracy
value: 85.25958157000656
- type: manhattan_ap
value: 71.12967638566641
- type: manhattan_f1
value: 65.77477594492791
- type: manhattan_precision
value: 64.77359938603223
- type: manhattan_recall
value: 66.80738786279683
- type: max_accuracy
value: 85.25958157000656
- type: max_ap
value: 71.18847194129523
- type: max_f1
value: 66.00049962528105
- task:
type: PairClassification
dataset:
name: MTEB TwitterURLCorpus
type: mteb/twitterurlcorpus-pairclassification
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 88.22330888345559
- type: cos_sim_ap
value: 84.40304506741951
- type: cos_sim_f1
value: 76.46823520855303
- type: cos_sim_precision
value: 72.45537867824409
- type: cos_sim_recall
value: 80.95164767477672
- type: dot_accuracy
value: 87.9400007761866
- type: dot_ap
value: 83.63499141834609
- type: dot_f1
value: 75.98620939938304
- type: dot_precision
value: 71.86792064254823
- type: dot_recall
value: 80.60517400677548
- type: euclidean_accuracy
value: 88.21166608452671
- type: euclidean_ap
value: 84.40463988450605
- type: euclidean_f1
value: 76.52312831312177
- type: euclidean_precision
value: 72.40621135083138
- type: euclidean_recall
value: 81.13643363104404
- type: manhattan_accuracy
value: 88.24659448131331
- type: manhattan_ap
value: 84.42287495905447
- type: manhattan_f1
value: 76.54849595413475
- type: manhattan_precision
value: 72.39036442248302
- type: manhattan_recall
value: 81.21342777948875
- type: max_accuracy
value: 88.24659448131331
- type: max_ap
value: 84.42287495905447
- type: max_f1
value: 76.54849595413475
---
# b1ade-embed-kd
This is a [sentence-transformers](https://www.SBERT.net) model: it maps sentences and paragraphs to a 1024-dimensional dense vector space and can be used for tasks like clustering or semantic search.
## Usage (Sentence-Transformers)
Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
```
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]
model = SentenceTransformer('{MODEL_NAME}')
embeddings = model.encode(sentences)
print(embeddings)
```
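Since all texts are mapped into the same dense vector space, semantic search reduces to encoding both sides and ranking by cosine similarity. Below is a minimal sketch using the `util` helpers that ship with sentence-transformers; the `{MODEL_NAME}` placeholder and the example texts are purely illustrative:
```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer('{MODEL_NAME}')

query = ["How do I install the library?"]
corpus = [
    "Run pip install -U sentence-transformers.",
    "The weather is nice today.",
]

# Encode both sides and score every (query, document) pair
query_emb = model.encode(query, convert_to_tensor=True)
corpus_emb = model.encode(corpus, convert_to_tensor=True)
scores = util.cos_sim(query_emb, corpus_emb)  # shape: (len(query), len(corpus))
print(scores)
```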
## Usage (HuggingFace Transformers)
Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: first pass your input through the transformer model, then apply the right pooling operation on top of the contextualized word embeddings.
```python
from transformers import AutoTokenizer, AutoModel
import torch

# Mean pooling: take the attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
    token_embeddings = model_output[0]  # first element of model_output contains all token embeddings
    input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
    return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)

# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']

# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('{MODEL_NAME}')
model = AutoModel.from_pretrained('{MODEL_NAME}')

# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')

# Compute token embeddings
with torch.no_grad():
    model_output = model(**encoded_input)

# Perform pooling (mean pooling in this case)
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])

print("Sentence embeddings:")
print(sentence_embeddings)
```
## Evaluation Results
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name={MODEL_NAME})
## Training
The model was distilled from a teacher model into the student model b1ade-embed.
**DataLoader**:
`torch.utils.data.dataloader.DataLoader` of length 275105 with parameters:
```
{'batch_size': 32, 'sampler': 'torch.utils.data.sampler.RandomSampler', 'batch_sampler': 'torch.utils.data.sampler.BatchSampler'}
```
**Loss**:
`sentence_transformers.losses.MSELoss.MSELoss`
Parameters of the fit() method:
```
{
"epochs": 3,
"evaluation_steps": 5000,
"evaluator": "sentence_transformers.evaluation.SequentialEvaluator.SequentialEvaluator",
"max_grad_norm": 1,
"optimizer_class": "<class 'torch.optim.adamw.AdamW'>",
"optimizer_params": {
"eps": 1e-06,
"lr": 5e-05
},
"scheduler": "WarmupLinear",
"steps_per_epoch": null,
"warmup_steps": 1000,
"weight_decay": 0.01
}
```
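For reference, here is a minimal sketch of how an MSE-based distillation run like this is typically wired up in sentence-transformers. The teacher name (`teacher-model-name`) and the training sentences are assumptions for illustration, since the teacher is not named in this card; the hyperparameters mirror the fit() parameters above:
```python
from sentence_transformers import SentenceTransformer, InputExample, losses
from torch.utils.data import DataLoader

teacher = SentenceTransformer('teacher-model-name')  # hypothetical: the teacher is not named in this card
student = SentenceTransformer('b1ade-embed')

# Stand-in training text; the real run's DataLoader had length 275105
sentences = ["This is an example sentence", "Each sentence is converted"]

# Label each sentence with the teacher's embedding; MSELoss pulls the
# student's embedding towards that target vector.
examples = [InputExample(texts=[s], label=teacher.encode(s)) for s in sentences]
loader = DataLoader(examples, shuffle=True, batch_size=32)
loss = losses.MSELoss(model=student)

student.fit(
    train_objectives=[(loader, loss)],
    epochs=3,
    warmup_steps=1000,
    optimizer_params={'eps': 1e-6, 'lr': 5e-5},
    weight_decay=0.01,
    max_grad_norm=1,
)
```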
## Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
## Results
The student shows good agreement with the teacher model, at least on STS:
Teacher:
```
2024-05-20 16:29:07 - Teacher Performance:
2024-05-20 16:29:07 - EmbeddingSimilarityEvaluator: Evaluating the model on the sts-dev dataset:
2024-05-20 16:29:12 - Cosine-Similarity : Pearson: 0.8561 Spearman: 0.8597
2024-05-20 16:29:12 - Manhattan-Distance: Pearson: 0.8569 Spearman: 0.8567
2024-05-20 16:29:12 - Euclidean-Distance: Pearson: 0.8575 Spearman: 0.8571
2024-05-20 16:29:12 - Dot-Product-Similarity: Pearson: 0.8624 Spearman: 0.8662
```
Student:
```
2024-05-20 16:29:12 - Student Performance:
2024-05-20 16:29:12 - EmbeddingSimilarityEvaluator: Evaluating the model on the sts-dev dataset:
2024-05-20 16:29:17 - Cosine-Similarity : Pearson: 0.8561 Spearman: 0.8597
2024-05-20 16:29:17 - Manhattan-Distance: Pearson: 0.8569 Spearman: 0.8567
2024-05-20 16:29:17 - Euclidean-Distance: Pearson: 0.8575 Spearman: 0.8571
2024-05-20 16:29:17 - Dot-Product-Similarity: Pearson: 0.8624 Spearman: 0.8662
```
"value": 79.373}, {"type": "recall_at_20", "value": 37.475}, {"type": "recall_at_3", "value": 18.791}, {"type": "recall_at_5", "value": 22.993}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackPhysicsRetrieval", "type": "mteb/cqadupstack-physics", "config": "default", "split": "test", "revision": "79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4"}, "metrics": [{"type": "map_at_1", "value": 25.281}, {"type": "map_at_10", "value": 34.875}, {"type": "map_at_100", "value": 36.268}, {"type": "map_at_1000", "value": 36.385}, {"type": "map_at_20", "value": 35.711999999999996}, {"type": "map_at_3", "value": 31.808999999999997}, {"type": "map_at_5", "value": 33.550999999999995}, {"type": "mrr_at_1", "value": 31.28}, {"type": "mrr_at_10", "value": 40.489000000000004}, {"type": "mrr_at_100", "value": 41.434}, {"type": "mrr_at_1000", "value": 41.491}, {"type": "mrr_at_20", "value": 41.088}, {"type": "mrr_at_3", "value": 38.033}, {"type": "mrr_at_5", "value": 39.621}, {"type": "ndcg_at_1", "value": 31.28}, {"type": "ndcg_at_10", "value": 40.716}, {"type": "ndcg_at_100", "value": 46.45}, {"type": "ndcg_at_1000", "value": 48.851}, {"type": "ndcg_at_20", "value": 43.216}, {"type": "ndcg_at_3", "value": 35.845}, {"type": "ndcg_at_5", "value": 38.251000000000005}, {"type": "precision_at_1", "value": 31.28}, {"type": "precision_at_10", "value": 7.623}, {"type": "precision_at_100", "value": 1.214}, {"type": "precision_at_1000", "value": 0.159}, {"type": "precision_at_20", "value": 4.625}, {"type": "precision_at_3", "value": 17.26}, {"type": "precision_at_5", "value": 12.435}, {"type": "recall_at_1", "value": 25.281}, {"type": "recall_at_10", "value": 52.476}, {"type": "recall_at_100", "value": 76.535}, {"type": "recall_at_1000", "value": 92.658}, {"type": "recall_at_20", "value": 61.211000000000006}, {"type": "recall_at_3", "value": 38.805}, {"type": "recall_at_5", "value": 45.053}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackProgrammersRetrieval", "type": "mteb/cqadupstack-programmers", "config": "default", "split": "test", "revision": "6184bc1440d2dbc7612be22b50686b8826d22b32"}, "metrics": [{"type": "map_at_1", "value": 20.092}, {"type": "map_at_10", "value": 27.805999999999997}, {"type": "map_at_100", "value": 29.137999999999998}, {"type": "map_at_1000", "value": 29.266}, {"type": "map_at_20", "value": 28.587}, {"type": "map_at_3", "value": 25.112000000000002}, {"type": "map_at_5", "value": 26.551000000000002}, {"type": "mrr_at_1", "value": 24.315}, {"type": "mrr_at_10", "value": 32.068000000000005}, {"type": "mrr_at_100", "value": 33.039}, {"type": "mrr_at_1000", "value": 33.114}, {"type": "mrr_at_20", "value": 32.66}, {"type": "mrr_at_3", "value": 29.49}, {"type": "mrr_at_5", "value": 30.906}, {"type": "ndcg_at_1", "value": 24.315}, {"type": "ndcg_at_10", "value": 32.9}, {"type": "ndcg_at_100", "value": 38.741}, {"type": "ndcg_at_1000", "value": 41.657}, {"type": "ndcg_at_20", "value": 35.338}, {"type": "ndcg_at_3", "value": 28.069}, {"type": "ndcg_at_5", "value": 30.169}, {"type": "precision_at_1", "value": 24.315}, {"type": "precision_at_10", "value": 6.2330000000000005}, {"type": "precision_at_100", "value": 1.072}, {"type": "precision_at_1000", "value": 0.15}, {"type": "precision_at_20", "value": 3.8580000000000005}, {"type": "precision_at_3", "value": 13.318}, {"type": "precision_at_5", "value": 9.748999999999999}, {"type": "recall_at_1", "value": 20.092}, {"type": "recall_at_10", "value": 43.832}, {"type": "recall_at_100", "value": 68.75099999999999}, 
{"type": "recall_at_1000", "value": 89.25}, {"type": "recall_at_20", "value": 52.445}, {"type": "recall_at_3", "value": 30.666}, {"type": "recall_at_5", "value": 35.873}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackRetrieval", "type": "mteb/cqadupstack", "config": "default", "split": "test", "revision": "160c094312a0e1facb97e55eeddb698c0abe3571"}, "metrics": [{"type": "map_at_1", "value": 19.317}, {"type": "map_at_10", "value": 26.653}, {"type": "map_at_100", "value": 28.011999999999997}, {"type": "map_at_1000", "value": 28.231}, {"type": "map_at_20", "value": 27.301}, {"type": "map_at_3", "value": 23.763}, {"type": "map_at_5", "value": 25.391000000000002}, {"type": "mrr_at_1", "value": 24.506}, {"type": "mrr_at_10", "value": 31.991999999999997}, {"type": "mrr_at_100", "value": 32.924}, {"type": "mrr_at_1000", "value": 32.993}, {"type": "mrr_at_20", "value": 32.521}, {"type": "mrr_at_3", "value": 29.48}, {"type": "mrr_at_5", "value": 30.982}, {"type": "ndcg_at_1", "value": 24.506}, {"type": "ndcg_at_10", "value": 32.202999999999996}, {"type": "ndcg_at_100", "value": 37.797}, {"type": "ndcg_at_1000", "value": 40.859}, {"type": "ndcg_at_20", "value": 34.098}, {"type": "ndcg_at_3", "value": 27.552}, {"type": "ndcg_at_5", "value": 29.781000000000002}, {"type": "precision_at_1", "value": 24.506}, {"type": "precision_at_10", "value": 6.462}, {"type": "precision_at_100", "value": 1.35}, {"type": "precision_at_1000", "value": 0.22499999999999998}, {"type": "precision_at_20", "value": 4.071000000000001}, {"type": "precision_at_3", "value": 13.241}, {"type": "precision_at_5", "value": 9.921000000000001}, {"type": "recall_at_1", "value": 19.317}, {"type": "recall_at_10", "value": 42.296}, {"type": "recall_at_100", "value": 68.2}, {"type": "recall_at_1000", "value": 88.565}, {"type": "recall_at_20", "value": 49.883}, {"type": "recall_at_3", "value": 28.608}, {"type": "recall_at_5", "value": 34.854}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackStatsRetrieval", "type": "mteb/cqadupstack-stats", "config": "default", "split": "test", "revision": "65ac3a16b8e91f9cee4c9828cc7c335575432a2a"}, "metrics": [{"type": "map_at_1", "value": 18.0}, {"type": "map_at_10", "value": 24.444}, {"type": "map_at_100", "value": 25.205}, {"type": "map_at_1000", "value": 25.291000000000004}, {"type": "map_at_20", "value": 24.834}, {"type": "map_at_3", "value": 22.311}, {"type": "map_at_5", "value": 23.442}, {"type": "mrr_at_1", "value": 20.552}, {"type": "mrr_at_10", "value": 27.028999999999996}, {"type": "mrr_at_100", "value": 27.706999999999997}, {"type": "mrr_at_1000", "value": 27.775}, {"type": "mrr_at_20", "value": 27.366}, {"type": "mrr_at_3", "value": 25.051000000000002}, {"type": "mrr_at_5", "value": 26.063}, {"type": "ndcg_at_1", "value": 20.552}, {"type": "ndcg_at_10", "value": 28.519}, {"type": "ndcg_at_100", "value": 32.580999999999996}, {"type": "ndcg_at_1000", "value": 34.99}, {"type": "ndcg_at_20", "value": 29.848000000000003}, {"type": "ndcg_at_3", "value": 24.46}, {"type": "ndcg_at_5", "value": 26.273000000000003}, {"type": "precision_at_1", "value": 20.552}, {"type": "precision_at_10", "value": 4.801}, {"type": "precision_at_100", "value": 0.729}, {"type": "precision_at_1000", "value": 0.101}, {"type": "precision_at_20", "value": 2.715}, {"type": "precision_at_3", "value": 10.940999999999999}, {"type": "precision_at_5", "value": 7.761}, {"type": "recall_at_1", "value": 18.0}, {"type": "recall_at_10", "value": 38.425}, {"type": "recall_at_100", "value": 
57.885}, {"type": "recall_at_1000", "value": 75.945}, {"type": "recall_at_20", "value": 43.472}, {"type": "recall_at_3", "value": 27.483}, {"type": "recall_at_5", "value": 31.866}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackTexRetrieval", "type": "mteb/cqadupstack-tex", "config": "default", "split": "test", "revision": "46989137a86843e03a6195de44b09deda022eec7"}, "metrics": [{"type": "map_at_1", "value": 10.014000000000001}, {"type": "map_at_10", "value": 14.462}, {"type": "map_at_100", "value": 15.364}, {"type": "map_at_1000", "value": 15.482999999999999}, {"type": "map_at_20", "value": 14.931}, {"type": "map_at_3", "value": 12.842}, {"type": "map_at_5", "value": 13.697999999999999}, {"type": "mrr_at_1", "value": 12.526000000000002}, {"type": "mrr_at_10", "value": 17.433}, {"type": "mrr_at_100", "value": 18.296}, {"type": "mrr_at_1000", "value": 18.383}, {"type": "mrr_at_20", "value": 17.897}, {"type": "mrr_at_3", "value": 15.703}, {"type": "mrr_at_5", "value": 16.627}, {"type": "ndcg_at_1", "value": 12.526000000000002}, {"type": "ndcg_at_10", "value": 17.697}, {"type": "ndcg_at_100", "value": 22.33}, {"type": "ndcg_at_1000", "value": 25.587}, {"type": "ndcg_at_20", "value": 19.302}, {"type": "ndcg_at_3", "value": 14.606}, {"type": "ndcg_at_5", "value": 15.946}, {"type": "precision_at_1", "value": 12.526000000000002}, {"type": "precision_at_10", "value": 3.383}, {"type": "precision_at_100", "value": 0.6799999999999999}, {"type": "precision_at_1000", "value": 0.11199999999999999}, {"type": "precision_at_20", "value": 2.147}, {"type": "precision_at_3", "value": 7.02}, {"type": "precision_at_5", "value": 5.196}, {"type": "recall_at_1", "value": 10.014000000000001}, {"type": "recall_at_10", "value": 24.623}, {"type": "recall_at_100", "value": 45.795}, {"type": "recall_at_1000", "value": 69.904}, {"type": "recall_at_20", "value": 30.534}, {"type": "recall_at_3", "value": 15.955}, {"type": "recall_at_5", "value": 19.394}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackUnixRetrieval", "type": "mteb/cqadupstack-unix", "config": "default", "split": "test", "revision": "6c6430d3a6d36f8d2a829195bc5dc94d7e063e53"}, "metrics": [{"type": "map_at_1", "value": 19.156000000000002}, {"type": "map_at_10", "value": 26.144000000000002}, {"type": "map_at_100", "value": 27.157999999999998}, {"type": "map_at_1000", "value": 27.288}, {"type": "map_at_20", "value": 26.689}, {"type": "map_at_3", "value": 24.125}, {"type": "map_at_5", "value": 25.369000000000003}, {"type": "mrr_at_1", "value": 22.854}, {"type": "mrr_at_10", "value": 29.874000000000002}, {"type": "mrr_at_100", "value": 30.738}, {"type": "mrr_at_1000", "value": 30.826999999999998}, {"type": "mrr_at_20", "value": 30.354}, {"type": "mrr_at_3", "value": 27.689999999999998}, {"type": "mrr_at_5", "value": 29.131}, {"type": "ndcg_at_1", "value": 22.854}, {"type": "ndcg_at_10", "value": 30.469}, {"type": "ndcg_at_100", "value": 35.475}, {"type": "ndcg_at_1000", "value": 38.59}, {"type": "ndcg_at_20", "value": 32.333}, {"type": "ndcg_at_3", "value": 26.674999999999997}, {"type": "ndcg_at_5", "value": 28.707}, {"type": "precision_at_1", "value": 22.854}, {"type": "precision_at_10", "value": 5.1209999999999996}, {"type": "precision_at_100", "value": 0.8500000000000001}, {"type": "precision_at_1000", "value": 0.123}, {"type": "precision_at_20", "value": 3.0460000000000003}, {"type": "precision_at_3", "value": 12.127}, {"type": "precision_at_5", "value": 8.75}, {"type": "recall_at_1", "value": 
19.156000000000002}, {"type": "recall_at_10", "value": 40.009}, {"type": "recall_at_100", "value": 62.419999999999995}, {"type": "recall_at_1000", "value": 84.585}, {"type": "recall_at_20", "value": 46.912}, {"type": "recall_at_3", "value": 29.733999999999998}, {"type": "recall_at_5", "value": 34.741}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackWebmastersRetrieval", "type": "mteb/cqadupstack-webmasters", "config": "default", "split": "test", "revision": "160c094312a0e1facb97e55eeddb698c0abe3571"}, "metrics": [{"type": "map_at_1", "value": 19.317}, {"type": "map_at_10", "value": 26.653}, {"type": "map_at_100", "value": 28.011999999999997}, {"type": "map_at_1000", "value": 28.231}, {"type": "map_at_20", "value": 27.301}, {"type": "map_at_3", "value": 23.763}, {"type": "map_at_5", "value": 25.391000000000002}, {"type": "mrr_at_1", "value": 24.506}, {"type": "mrr_at_10", "value": 31.991999999999997}, {"type": "mrr_at_100", "value": 32.924}, {"type": "mrr_at_1000", "value": 32.993}, {"type": "mrr_at_20", "value": 32.521}, {"type": "mrr_at_3", "value": 29.48}, {"type": "mrr_at_5", "value": 30.982}, {"type": "ndcg_at_1", "value": 24.506}, {"type": "ndcg_at_10", "value": 32.202999999999996}, {"type": "ndcg_at_100", "value": 37.797}, {"type": "ndcg_at_1000", "value": 40.859}, {"type": "ndcg_at_20", "value": 34.098}, {"type": "ndcg_at_3", "value": 27.552}, {"type": "ndcg_at_5", "value": 29.781000000000002}, {"type": "precision_at_1", "value": 24.506}, {"type": "precision_at_10", "value": 6.462}, {"type": "precision_at_100", "value": 1.35}, {"type": "precision_at_1000", "value": 0.22499999999999998}, {"type": "precision_at_20", "value": 4.071000000000001}, {"type": "precision_at_3", "value": 13.241}, {"type": "precision_at_5", "value": 9.921000000000001}, {"type": "recall_at_1", "value": 19.317}, {"type": "recall_at_10", "value": 42.296}, {"type": "recall_at_100", "value": 68.2}, {"type": "recall_at_1000", "value": 88.565}, {"type": "recall_at_20", "value": 49.883}, {"type": "recall_at_3", "value": 28.608}, {"type": "recall_at_5", "value": 34.854}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackWordpressRetrieval", "type": "mteb/cqadupstack-wordpress", "config": "default", "split": "test", "revision": "4ffe81d471b1924886b33c7567bfb200e9eec5c4"}, "metrics": [{"type": "map_at_1", "value": 12.822}, {"type": "map_at_10", "value": 18.055}, {"type": "map_at_100", "value": 18.942}, {"type": "map_at_1000", "value": 19.057}, {"type": "map_at_20", "value": 18.544}, {"type": "map_at_3", "value": 15.964}, {"type": "map_at_5", "value": 16.833000000000002}, {"type": "mrr_at_1", "value": 14.048}, {"type": "mrr_at_10", "value": 19.489}, {"type": "mrr_at_100", "value": 20.392}, {"type": "mrr_at_1000", "value": 20.49}, {"type": "mrr_at_20", "value": 19.979}, {"type": "mrr_at_3", "value": 17.344}, {"type": "mrr_at_5", "value": 18.287}, {"type": "ndcg_at_1", "value": 14.048}, {"type": "ndcg_at_10", "value": 21.737000000000002}, {"type": "ndcg_at_100", "value": 26.383000000000003}, {"type": "ndcg_at_1000", "value": 29.555}, {"type": "ndcg_at_20", "value": 23.463}, {"type": "ndcg_at_3", "value": 17.29}, {"type": "ndcg_at_5", "value": 18.829}, {"type": "precision_at_1", "value": 14.048}, {"type": "precision_at_10", "value": 3.6229999999999998}, {"type": "precision_at_100", "value": 0.641}, {"type": "precision_at_1000", "value": 0.099}, {"type": "precision_at_20", "value": 2.1999999999999997}, {"type": "precision_at_3", "value": 7.2090000000000005}, {"type": 
"precision_at_5", "value": 5.213}, {"type": "recall_at_1", "value": 12.822}, {"type": "recall_at_10", "value": 32.123000000000005}, {"type": "recall_at_100", "value": 53.657999999999994}, {"type": "recall_at_1000", "value": 77.72200000000001}, {"type": "recall_at_20", "value": 38.66}, {"type": "recall_at_3", "value": 19.814999999999998}, {"type": "recall_at_5", "value": 23.432}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB ClimateFEVER", "type": "mteb/climate-fever", "config": "default", "split": "test", "revision": "47f2ac6acb640fc46020b02a5b59fdda04d39380"}, "metrics": [{"type": "map_at_1", "value": 13.119}, {"type": "map_at_10", "value": 22.999}, {"type": "map_at_100", "value": 25.108000000000004}, {"type": "map_at_1000", "value": 25.306}, {"type": "map_at_20", "value": 24.141000000000002}, {"type": "map_at_3", "value": 19.223000000000003}, {"type": "map_at_5", "value": 21.181}, {"type": "mrr_at_1", "value": 30.554}, {"type": "mrr_at_10", "value": 42.553000000000004}, {"type": "mrr_at_100", "value": 43.498}, {"type": "mrr_at_1000", "value": 43.527}, {"type": "mrr_at_20", "value": 43.193}, {"type": "mrr_at_3", "value": 39.283}, {"type": "mrr_at_5", "value": 41.143}, {"type": "ndcg_at_1", "value": 30.554}, {"type": "ndcg_at_10", "value": 31.946}, {"type": "ndcg_at_100", "value": 39.934999999999995}, {"type": "ndcg_at_1000", "value": 43.256}, {"type": "ndcg_at_20", "value": 35.101}, {"type": "ndcg_at_3", "value": 26.489}, {"type": "ndcg_at_5", "value": 28.272000000000002}, {"type": "precision_at_1", "value": 30.554}, {"type": "precision_at_10", "value": 10.039}, {"type": "precision_at_100", "value": 1.864}, {"type": "precision_at_1000", "value": 0.248}, {"type": "precision_at_20", "value": 6.371}, {"type": "precision_at_3", "value": 20.174}, {"type": "precision_at_5", "value": 15.296000000000001}, {"type": "recall_at_1", "value": 13.119}, {"type": "recall_at_10", "value": 37.822}, {"type": "recall_at_100", "value": 65.312}, {"type": "recall_at_1000", "value": 83.817}, {"type": "recall_at_20", "value": 46.760000000000005}, {"type": "recall_at_3", "value": 23.858999999999998}, {"type": "recall_at_5", "value": 29.609999999999996}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB DBPedia", "type": "mteb/dbpedia", "config": "default", "split": "test", "revision": "c0f706b76e590d620bd6618b3ca8efdd34e2d659"}, "metrics": [{"type": "map_at_1", "value": 8.176}, {"type": "map_at_10", "value": 19.594}, {"type": "map_at_100", "value": 28.081}, {"type": "map_at_1000", "value": 29.864}, {"type": "map_at_20", "value": 22.983999999999998}, {"type": "map_at_3", "value": 13.923}, {"type": "map_at_5", "value": 16.597}, {"type": "mrr_at_1", "value": 66.75}, {"type": "mrr_at_10", "value": 75.82600000000001}, {"type": "mrr_at_100", "value": 76.145}, {"type": "mrr_at_1000", "value": 76.14999999999999}, {"type": "mrr_at_20", "value": 76.074}, {"type": "mrr_at_3", "value": 74.333}, {"type": "mrr_at_5", "value": 75.25800000000001}, {"type": "ndcg_at_1", "value": 54.50000000000001}, {"type": "ndcg_at_10", "value": 41.806}, {"type": "ndcg_at_100", "value": 47.067}, {"type": "ndcg_at_1000", "value": 54.397}, {"type": "ndcg_at_20", "value": 41.727}, {"type": "ndcg_at_3", "value": 46.92}, {"type": "ndcg_at_5", "value": 44.381}, {"type": "precision_at_1", "value": 66.75}, {"type": "precision_at_10", "value": 33.35}, {"type": "precision_at_100", "value": 10.92}, {"type": "precision_at_1000", "value": 2.222}, {"type": "precision_at_20", "value": 25.862000000000002}, {"type": "precision_at_3", 
"value": 51.417}, {"type": "precision_at_5", "value": 43.65}, {"type": "recall_at_1", "value": 8.176}, {"type": "recall_at_10", "value": 26.029000000000003}, {"type": "recall_at_100", "value": 53.872}, {"type": "recall_at_1000", "value": 76.895}, {"type": "recall_at_20", "value": 34.192}, {"type": "recall_at_3", "value": 15.789}, {"type": "recall_at_5", "value": 20.255000000000003}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB EmotionClassification", "type": "mteb/emotion", "config": "default", "split": "test", "revision": "4f58c6b202a23cf9a4da393831edf4f9183cad37"}, "metrics": [{"type": "accuracy", "value": 48.22}, {"type": "f1", "value": 43.59074485488622}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB FEVER", "type": "mteb/fever", "config": "default", "split": "test", "revision": "bea83ef9e8fb933d90a2f1d5515737465d613e12"}, "metrics": [{"type": "map_at_1", "value": 40.872}, {"type": "map_at_10", "value": 55.178000000000004}, {"type": "map_at_100", "value": 55.859}, {"type": "map_at_1000", "value": 55.881}, {"type": "map_at_20", "value": 55.66}, {"type": "map_at_3", "value": 51.4}, {"type": "map_at_5", "value": 53.754000000000005}, {"type": "mrr_at_1", "value": 43.744}, {"type": "mrr_at_10", "value": 58.36900000000001}, {"type": "mrr_at_100", "value": 58.911}, {"type": "mrr_at_1000", "value": 58.916999999999994}, {"type": "mrr_at_20", "value": 58.779}, {"type": "mrr_at_3", "value": 54.653}, {"type": "mrr_at_5", "value": 56.987}, {"type": "ndcg_at_1", "value": 43.744}, {"type": "ndcg_at_10", "value": 62.936}, {"type": "ndcg_at_100", "value": 65.666}, {"type": "ndcg_at_1000", "value": 66.08699999999999}, {"type": "ndcg_at_20", "value": 64.548}, {"type": "ndcg_at_3", "value": 55.543}, {"type": "ndcg_at_5", "value": 59.646}, {"type": "precision_at_1", "value": 43.744}, {"type": "precision_at_10", "value": 9.191}, {"type": "precision_at_100", "value": 1.072}, {"type": "precision_at_1000", "value": 0.11299999999999999}, {"type": "precision_at_20", "value": 4.967}, {"type": "precision_at_3", "value": 23.157}, {"type": "precision_at_5", "value": 16.115}, {"type": "recall_at_1", "value": 40.872}, {"type": "recall_at_10", "value": 83.818}, {"type": "recall_at_100", "value": 95.14200000000001}, {"type": "recall_at_1000", "value": 97.897}, {"type": "recall_at_20", "value": 89.864}, {"type": "recall_at_3", "value": 64.19200000000001}, {"type": "recall_at_5", "value": 74.029}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB FiQA2018", "type": "mteb/fiqa", "config": "default", "split": "test", "revision": "27a168819829fe9bcd655c2df245fb19452e8e06"}, "metrics": [{"type": "map_at_1", "value": 14.804999999999998}, {"type": "map_at_10", "value": 22.86}, {"type": "map_at_100", "value": 24.823999999999998}, {"type": "map_at_1000", "value": 25.041000000000004}, {"type": "map_at_20", "value": 23.881}, {"type": "map_at_3", "value": 20.09}, {"type": "map_at_5", "value": 21.39}, {"type": "mrr_at_1", "value": 29.938}, {"type": "mrr_at_10", "value": 37.041000000000004}, {"type": "mrr_at_100", "value": 38.196000000000005}, {"type": "mrr_at_1000", "value": 38.256}, {"type": "mrr_at_20", "value": 37.693}, {"type": "mrr_at_3", "value": 34.721999999999994}, {"type": "mrr_at_5", "value": 35.787}, {"type": "ndcg_at_1", "value": 29.938}, {"type": "ndcg_at_10", "value": 29.358}, {"type": "ndcg_at_100", "value": 37.544}, {"type": "ndcg_at_1000", "value": 41.499}, {"type": "ndcg_at_20", "value": 32.354}, {"type": "ndcg_at_3", "value": 26.434}, {"type": "ndcg_at_5", "value": 
26.93}, {"type": "precision_at_1", "value": 29.938}, {"type": "precision_at_10", "value": 8.117}, {"type": "precision_at_100", "value": 1.611}, {"type": "precision_at_1000", "value": 0.232}, {"type": "precision_at_20", "value": 5.255}, {"type": "precision_at_3", "value": 17.49}, {"type": "precision_at_5", "value": 12.747}, {"type": "recall_at_1", "value": 14.804999999999998}, {"type": "recall_at_10", "value": 34.776}, {"type": "recall_at_100", "value": 66.279}, {"type": "recall_at_1000", "value": 89.96600000000001}, {"type": "recall_at_20", "value": 44.31}, {"type": "recall_at_3", "value": 23.623}, {"type": "recall_at_5", "value": 27.194000000000003}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB HotpotQA", "type": "mteb/hotpotqa", "config": "default", "split": "test", "revision": "ab518f4d6fcca38d87c25209f94beba119d02014"}, "metrics": [{"type": "map_at_1", "value": 38.555}, {"type": "map_at_10", "value": 54.20700000000001}, {"type": "map_at_100", "value": 55.177}, {"type": "map_at_1000", "value": 55.254999999999995}, {"type": "map_at_20", "value": 54.788000000000004}, {"type": "map_at_3", "value": 51.034}, {"type": "map_at_5", "value": 52.998}, {"type": "mrr_at_1", "value": 77.11}, {"type": "mrr_at_10", "value": 82.93199999999999}, {"type": "mrr_at_100", "value": 83.14200000000001}, {"type": "mrr_at_1000", "value": 83.15}, {"type": "mrr_at_20", "value": 83.062}, {"type": "mrr_at_3", "value": 81.95599999999999}, {"type": "mrr_at_5", "value": 82.586}, {"type": "ndcg_at_1", "value": 77.11}, {"type": "ndcg_at_10", "value": 63.853}, {"type": "ndcg_at_100", "value": 67.18499999999999}, {"type": "ndcg_at_1000", "value": 68.676}, {"type": "ndcg_at_20", "value": 65.279}, {"type": "ndcg_at_3", "value": 59.301}, {"type": "ndcg_at_5", "value": 61.822}, {"type": "precision_at_1", "value": 77.11}, {"type": "precision_at_10", "value": 13.044}, {"type": "precision_at_100", "value": 1.5630000000000002}, {"type": "precision_at_1000", "value": 0.17600000000000002}, {"type": "precision_at_20", "value": 6.979}, {"type": "precision_at_3", "value": 36.759}, {"type": "precision_at_5", "value": 24.054000000000002}, {"type": "recall_at_1", "value": 38.555}, {"type": "recall_at_10", "value": 65.21900000000001}, {"type": "recall_at_100", "value": 78.16300000000001}, {"type": "recall_at_1000", "value": 88.02799999999999}, {"type": "recall_at_20", "value": 69.791}, {"type": "recall_at_3", "value": 55.138}, {"type": "recall_at_5", "value": 60.135000000000005}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB ImdbClassification", "type": "mteb/imdb", "config": "default", "split": "test", "revision": "3d86128a09e091d6018b6d26cad27f2739fc2db7"}, "metrics": [{"type": "accuracy", "value": 69.8728}, {"type": "ap", "value": 63.98214492125858}, {"type": "f1", "value": 69.59975497754624}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MTOPDomainClassification", "type": "mteb/mtop_domain", "config": "default", "split": "test", "revision": "d80d48c1eb48d3562165c59d59d0034df9fff0bf"}, "metrics": [{"type": "accuracy", "value": 94.76288189694483}, {"type": "f1", "value": 94.52150972672682}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MTOPIntentClassification", "type": "mteb/mtop_intent", "config": "default", "split": "test", "revision": "ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba"}, "metrics": [{"type": "accuracy", "value": 76.83994528043777}, {"type": "f1", "value": 57.95571154189732}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB 
MassiveIntentClassification", "type": "mteb/amazon_massive_intent", "config": "default", "split": "test", "revision": "4672e20407010da34463acc759c162ca9734bca6"}, "metrics": [{"type": "accuracy", "value": 46.1163416274378}, {"type": "f1", "value": 45.425692244093064}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MassiveScenarioClassification", "type": "mteb/amazon_massive_scenario", "config": "default", "split": "test", "revision": "fad2c6e8459f9e1c45d9315f4953d921437d70f8"}, "metrics": [{"type": "accuracy", "value": 45.57834566240753}, {"type": "f1", "value": 43.84840097785479}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB MedrxivClusteringP2P", "type": "mteb/medrxiv-clustering-p2p", "config": "default", "split": "test", "revision": "e7a26af6f3ae46b30dde8737f02c07b1505bcc73"}, "metrics": [{"type": "v_measure", "value": 32.86396397182615}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB MedrxivClusteringS2S", "type": "mteb/medrxiv-clustering-s2s", "config": "default", "split": "test", "revision": "35191c8c0dca72d8ff3efcd72aa802307d469663"}, "metrics": [{"type": "v_measure", "value": 34.018965727588565}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB MindSmallReranking", "type": "mteb/mind_small", "config": "default", "split": "test", "revision": "59042f120c80e8afa9cdbb224f67076cec0fc9a7"}, "metrics": [{"type": "map", "value": 31.286618059824573}, {"type": "mrr", "value": 32.481830769278965}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB NFCorpus", "type": "mteb/nfcorpus", "config": "default", "split": "test", "revision": "ec0fa4fe99da2ff19ca1214b7966684033a58814"}, "metrics": [{"type": "map_at_1", "value": 4.236}, {"type": "map_at_10", "value": 9.352}, {"type": "map_at_100", "value": 12.382}, {"type": "map_at_1000", "value": 13.828999999999999}, {"type": "map_at_20", "value": 10.619}, {"type": "map_at_3", "value": 6.814000000000001}, {"type": "map_at_5", "value": 7.887}, {"type": "mrr_at_1", "value": 37.152}, {"type": "mrr_at_10", "value": 47.055}, {"type": "mrr_at_100", "value": 47.82}, {"type": "mrr_at_1000", "value": 47.86}, {"type": "mrr_at_20", "value": 47.605}, {"type": "mrr_at_3", "value": 44.118}, {"type": "mrr_at_5", "value": 46.115}, {"type": "ndcg_at_1", "value": 34.365}, {"type": "ndcg_at_10", "value": 28.473}, {"type": "ndcg_at_100", "value": 27.311999999999998}, {"type": "ndcg_at_1000", "value": 36.671}, {"type": "ndcg_at_20", "value": 27.137}, {"type": "ndcg_at_3", "value": 31.939}, {"type": "ndcg_at_5", "value": 30.428}, {"type": "precision_at_1", "value": 36.223}, {"type": "precision_at_10", "value": 21.858}, {"type": "precision_at_100", "value": 7.417999999999999}, {"type": "precision_at_1000", "value": 2.0709999999999997}, {"type": "precision_at_20", "value": 16.502}, {"type": "precision_at_3", "value": 30.857}, {"type": "precision_at_5", "value": 26.997}, {"type": "recall_at_1", "value": 4.236}, {"type": "recall_at_10", "value": 13.489}, {"type": "recall_at_100", "value": 29.580000000000002}, {"type": "recall_at_1000", "value": 62.726000000000006}, {"type": "recall_at_20", "value": 18.346999999999998}, {"type": "recall_at_3", "value": 7.811}, {"type": "recall_at_5", "value": 10.086}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB NQ", "type": "mteb/nq", "config": "default", "split": "test", "revision": "b774495ed302d8c44a3a7ea25c90dbce03968f31"}, "metrics": [{"type": "map_at_1", "value": 21.123}, {"type": "map_at_10", "value": 34.429}, {"type": "map_at_100", "value": 
35.803000000000004}, {"type": "map_at_1000", "value": 35.853}, {"type": "map_at_20", "value": 35.308}, {"type": "map_at_3", "value": 30.095}, {"type": "map_at_5", "value": 32.435}, {"type": "mrr_at_1", "value": 23.841}, {"type": "mrr_at_10", "value": 36.864999999999995}, {"type": "mrr_at_100", "value": 37.935}, {"type": "mrr_at_1000", "value": 37.97}, {"type": "mrr_at_20", "value": 37.566}, {"type": "mrr_at_3", "value": 32.918}, {"type": "mrr_at_5", "value": 35.11}, {"type": "ndcg_at_1", "value": 23.841}, {"type": "ndcg_at_10", "value": 42.043}, {"type": "ndcg_at_100", "value": 48.015}, {"type": "ndcg_at_1000", "value": 49.152}, {"type": "ndcg_at_20", "value": 44.936}, {"type": "ndcg_at_3", "value": 33.513999999999996}, {"type": "ndcg_at_5", "value": 37.541999999999994}, {"type": "precision_at_1", "value": 23.841}, {"type": "precision_at_10", "value": 7.454}, {"type": "precision_at_100", "value": 1.081}, {"type": "precision_at_1000", "value": 0.11900000000000001}, {"type": "precision_at_20", "value": 4.413}, {"type": "precision_at_3", "value": 15.672}, {"type": "precision_at_5", "value": 11.657}, {"type": "recall_at_1", "value": 21.123}, {"type": "recall_at_10", "value": 63.096}, {"type": "recall_at_100", "value": 89.27199999999999}, {"type": "recall_at_1000", "value": 97.69}, {"type": "recall_at_20", "value": 73.873}, {"type": "recall_at_3", "value": 40.588}, {"type": "recall_at_5", "value": 49.928}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB QuoraRetrieval", "type": "mteb/quora", "config": "default", "split": "test", "revision": "e4e08e0b7dbe3c8700f0daef558ff32256715259"}, "metrics": [{"type": "map_at_1", "value": 70.255}, {"type": "map_at_10", "value": 84.387}, {"type": "map_at_100", "value": 85.027}, {"type": "map_at_1000", "value": 85.043}, {"type": "map_at_20", "value": 84.809}, {"type": "map_at_3", "value": 81.5}, {"type": "map_at_5", "value": 83.286}, {"type": "mrr_at_1", "value": 80.85}, {"type": "mrr_at_10", "value": 87.25699999999999}, {"type": "mrr_at_100", "value": 87.363}, {"type": "mrr_at_1000", "value": 87.363}, {"type": "mrr_at_20", "value": 87.336}, {"type": "mrr_at_3", "value": 86.357}, {"type": "mrr_at_5", "value": 86.939}, {"type": "ndcg_at_1", "value": 80.86}, {"type": "ndcg_at_10", "value": 88.151}, {"type": "ndcg_at_100", "value": 89.381}, {"type": "ndcg_at_1000", "value": 89.47800000000001}, {"type": "ndcg_at_20", "value": 88.82100000000001}, {"type": "ndcg_at_3", "value": 85.394}, {"type": "ndcg_at_5", "value": 86.855}, {"type": "precision_at_1", "value": 80.86}, {"type": "precision_at_10", "value": 13.397}, {"type": "precision_at_100", "value": 1.5310000000000001}, {"type": "precision_at_1000", "value": 0.157}, {"type": "precision_at_20", "value": 7.106999999999999}, {"type": "precision_at_3", "value": 37.46}, {"type": "precision_at_5", "value": 24.568}, {"type": "recall_at_1", "value": 70.255}, {"type": "recall_at_10", "value": 95.405}, {"type": "recall_at_100", "value": 99.56}, {"type": "recall_at_1000", "value": 99.98599999999999}, {"type": "recall_at_20", "value": 97.544}, {"type": "recall_at_3", "value": 87.414}, {"type": "recall_at_5", "value": 91.598}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB RedditClustering", "type": "mteb/reddit-clustering", "config": "default", "split": "test", "revision": "24640382cdbf8abc73003fb0fa6d111a705499eb"}, "metrics": [{"type": "v_measure", "value": 54.7557403999403}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB RedditClusteringP2P", "type": "mteb/reddit-clustering-p2p", 
"config": "default", "split": "test", "revision": "385e3cb46b4cfa89021f56c4380204149d0efe33"}, "metrics": [{"type": "v_measure", "value": 56.2773308957202}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB SCIDOCS", "type": "mteb/scidocs", "config": "default", "split": "test", "revision": "f8c2fcf00f625baaa80f62ec5bd9e1fff3b8ae88"}, "metrics": [{"type": "map_at_1", "value": 4.123}, {"type": "map_at_10", "value": 9.940999999999999}, {"type": "map_at_100", "value": 11.928999999999998}, {"type": "map_at_1000", "value": 12.257}, {"type": "map_at_20", "value": 10.866000000000001}, {"type": "map_at_3", "value": 7.091}, {"type": "map_at_5", "value": 8.393}, {"type": "mrr_at_1", "value": 20.3}, {"type": "mrr_at_10", "value": 30.068}, {"type": "mrr_at_100", "value": 31.296000000000003}, {"type": "mrr_at_1000", "value": 31.36}, {"type": "mrr_at_20", "value": 30.756}, {"type": "mrr_at_3", "value": 26.667}, {"type": "mrr_at_5", "value": 28.616999999999997}, {"type": "ndcg_at_1", "value": 20.3}, {"type": "ndcg_at_10", "value": 17.305}, {"type": "ndcg_at_100", "value": 25.529000000000003}, {"type": "ndcg_at_1000", "value": 31.41}, {"type": "ndcg_at_20", "value": 19.967}, {"type": "ndcg_at_3", "value": 16.022}, {"type": "ndcg_at_5", "value": 14.12}, {"type": "precision_at_1", "value": 20.3}, {"type": "precision_at_10", "value": 9.06}, {"type": "precision_at_100", "value": 2.103}, {"type": "precision_at_1000", "value": 0.35200000000000004}, {"type": "precision_at_20", "value": 6.075}, {"type": "precision_at_3", "value": 14.832999999999998}, {"type": "precision_at_5", "value": 12.36}, {"type": "recall_at_1", "value": 4.123}, {"type": "recall_at_10", "value": 18.383}, {"type": "recall_at_100", "value": 42.67}, {"type": "recall_at_1000", "value": 71.44800000000001}, {"type": "recall_at_20", "value": 24.64}, {"type": "recall_at_3", "value": 9.043}, {"type": "recall_at_5", "value": 12.543000000000001}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB SICK-R", "type": "mteb/sickr-sts", "config": "default", "split": "test", "revision": "20a6d6f312dd54037fe07a32d58e5e168867909d"}, "metrics": [{"type": "cos_sim_pearson", "value": 84.37101718384514}, {"type": "cos_sim_spearman", "value": 80.73657031880697}, {"type": "euclidean_pearson", "value": 81.42351850520845}, {"type": "euclidean_spearman", "value": 80.81452496851979}, {"type": "manhattan_pearson", "value": 81.47676252115669}, {"type": "manhattan_spearman", "value": 80.87566944708885}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS12", "type": "mteb/sts12-sts", "config": "default", "split": "test", "revision": "a0d554a64d88156834ff5ae9920b964011b16384"}, "metrics": [{"type": "cos_sim_pearson", "value": 84.79559176971591}, {"type": "cos_sim_spearman", "value": 75.41866597445552}, {"type": "euclidean_pearson", "value": 83.20287101163838}, {"type": "euclidean_spearman", "value": 75.54564777571143}, {"type": "manhattan_pearson", "value": 83.24622548900163}, {"type": "manhattan_spearman", "value": 75.63826258190343}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS13", "type": "mteb/sts13-sts", "config": "default", "split": "test", "revision": "7e90230a92c190f1bf69ae9002b8cea547a64cca"}, "metrics": [{"type": "cos_sim_pearson", "value": 84.63322096299294}, {"type": "cos_sim_spearman", "value": 85.48272638914783}, {"type": "euclidean_pearson", "value": 85.57327707819331}, {"type": "euclidean_spearman", "value": 85.90735298172922}, {"type": "manhattan_pearson", "value": 85.5744191274933}, {"type": "manhattan_spearman", "value": 
85.90828008488766}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS14", "type": "mteb/sts14-sts", "config": "default", "split": "test", "revision": "6031580fec1f6af667f0bd2da0a551cf4f0b2375"}, "metrics": [{"type": "cos_sim_pearson", "value": 82.05530140566407}, {"type": "cos_sim_spearman", "value": 78.85454907951474}, {"type": "euclidean_pearson", "value": 81.4307311680376}, {"type": "euclidean_spearman", "value": 78.99131623529348}, {"type": "manhattan_pearson", "value": 81.46870892683134}, {"type": "manhattan_spearman", "value": 79.05473823658481}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS15", "type": "mteb/sts15-sts", "config": "default", "split": "test", "revision": "ae752c7c21bf194d8b67fd573edf7ae58183cbe3"}, "metrics": [{"type": "cos_sim_pearson", "value": 83.66620817683379}, {"type": "cos_sim_spearman", "value": 85.23347998035328}, {"type": "euclidean_pearson", "value": 84.59001637865366}, {"type": "euclidean_spearman", "value": 85.0081410316597}, {"type": "manhattan_pearson", "value": 84.59742325369818}, {"type": "manhattan_spearman", "value": 85.01721329704324}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS16", "type": "mteb/sts16-sts", "config": "default", "split": "test", "revision": "4d8694f8f0e0100860b497b999b3dbed754a0513"}, "metrics": [{"type": "cos_sim_pearson", "value": 79.86344730144208}, {"type": "cos_sim_spearman", "value": 82.15966778685441}, {"type": "euclidean_pearson", "value": 81.85580574642779}, {"type": "euclidean_spearman", "value": 82.06482873417123}, {"type": "manhattan_pearson", "value": 81.82971046102377}, {"type": "manhattan_spearman", "value": 82.04185436355144}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS17", "type": "mteb/sts17-crosslingual-sts", "config": "default", "split": "test", "revision": "faeb762787bd10488a50c8b5be4a3b82e411949c"}, "metrics": [{"type": "cos_sim_pearson", "value": 31.440481026661672}, {"type": "cos_sim_spearman", "value": 31.592743544965913}, {"type": "euclidean_pearson", "value": 31.15111049327518}, {"type": "euclidean_spearman", "value": 30.555124184361464}, {"type": "manhattan_pearson", "value": 31.724139249295654}, {"type": "manhattan_spearman", "value": 30.483389245793504}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS22", "type": "mteb/sts22-crosslingual-sts", "config": "default", "split": "test", "revision": "de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3"}, "metrics": [{"type": "cos_sim_pearson", "value": 34.51489724275415}, {"type": "cos_sim_spearman", "value": 47.06532141601629}, {"type": "euclidean_pearson", "value": 33.28904737503036}, {"type": "euclidean_spearman", "value": 45.111172981641865}, {"type": "manhattan_pearson", "value": 33.36374172942392}, {"type": "manhattan_spearman", "value": 45.100940945158534}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STSBenchmark", "type": "mteb/stsbenchmark-sts", "config": "default", "split": "test", "revision": "b0fddb56ed78048fa8b90373c8a3cfc37b684831"}, "metrics": [{"type": "cos_sim_pearson", "value": 82.09996292950329}, {"type": "cos_sim_spearman", "value": 82.69376206796092}, {"type": "euclidean_pearson", "value": 82.83254956369134}, {"type": "euclidean_spearman", "value": 82.34202999843637}, {"type": "manhattan_pearson", "value": 82.8048494319632}, {"type": "manhattan_spearman", "value": 82.34713123336984}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB SciDocsRR", "type": "mteb/scidocs-reranking", "config": "default", "split": "test", "revision": "d3c5e1fc0b855ab6097bf1cda04dd73947d7caab"}, 
"metrics": [{"type": "map", "value": 82.1402269601644}, {"type": "mrr", "value": 94.84447197682492}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB SciFact", "type": "mteb/scifact", "config": "default", "split": "test", "revision": "0228b52cf27578f30900b9e5271d331663a030d7"}, "metrics": [{"type": "map_at_1", "value": 49.138999999999996}, {"type": "map_at_10", "value": 60.288}, {"type": "map_at_100", "value": 61.082}, {"type": "map_at_1000", "value": 61.11}, {"type": "map_at_20", "value": 60.831999999999994}, {"type": "map_at_3", "value": 57.106}, {"type": "map_at_5", "value": 58.857000000000006}, {"type": "mrr_at_1", "value": 51.333}, {"type": "mrr_at_10", "value": 61.364}, {"type": "mrr_at_100", "value": 62.029999999999994}, {"type": "mrr_at_1000", "value": 62.056}, {"type": "mrr_at_20", "value": 61.85000000000001}, {"type": "mrr_at_3", "value": 58.721999999999994}, {"type": "mrr_at_5", "value": 60.221999999999994}, {"type": "ndcg_at_1", "value": 51.333}, {"type": "ndcg_at_10", "value": 65.71900000000001}, {"type": "ndcg_at_100", "value": 69.036}, {"type": "ndcg_at_1000", "value": 69.626}, {"type": "ndcg_at_20", "value": 67.571}, {"type": "ndcg_at_3", "value": 60.019}, {"type": "ndcg_at_5", "value": 62.733000000000004}, {"type": "precision_at_1", "value": 51.333}, {"type": "precision_at_10", "value": 9.067}, {"type": "precision_at_100", "value": 1.083}, {"type": "precision_at_1000", "value": 0.11299999999999999}, {"type": "precision_at_20", "value": 4.95}, {"type": "precision_at_3", "value": 23.889}, {"type": "precision_at_5", "value": 16.0}, {"type": "recall_at_1", "value": 49.138999999999996}, {"type": "recall_at_10", "value": 81.256}, {"type": "recall_at_100", "value": 95.6}, {"type": "recall_at_1000", "value": 100.0}, {"type": "recall_at_20", "value": 88.289}, {"type": "recall_at_3", "value": 66.078}, {"type": "recall_at_5", "value": 72.661}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB SprintDuplicateQuestions", "type": "mteb/sprintduplicatequestions-pairclassification", "config": "default", "split": "test", "revision": "d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46"}, "metrics": [{"type": "cos_sim_accuracy", "value": 99.73762376237623}, {"type": "cos_sim_ap", "value": 93.02149432690442}, {"type": "cos_sim_f1", "value": 86.59079663532904}, {"type": "cos_sim_precision", "value": 85.70029382957884}, {"type": "cos_sim_recall", "value": 87.5}, {"type": "dot_accuracy", "value": 99.73267326732673}, {"type": "dot_ap", "value": 92.38661051842968}, {"type": "dot_f1", "value": 85.92283628779978}, {"type": "dot_precision", "value": 89.76034858387798}, {"type": "dot_recall", "value": 82.39999999999999}, {"type": "euclidean_accuracy", "value": 99.73960396039604}, {"type": "euclidean_ap", "value": 92.99557708360517}, {"type": "euclidean_f1", "value": 86.49183572488866}, {"type": "euclidean_precision", "value": 85.60235063663075}, {"type": "euclidean_recall", "value": 87.4}, {"type": "manhattan_accuracy", "value": 99.74059405940594}, {"type": "manhattan_ap", "value": 93.24237279644005}, {"type": "manhattan_f1", "value": 86.77727501256913}, {"type": "manhattan_precision", "value": 87.25985844287159}, {"type": "manhattan_recall", "value": 86.3}, {"type": "max_accuracy", "value": 99.74059405940594}, {"type": "max_ap", "value": 93.24237279644005}, {"type": "max_f1", "value": 86.77727501256913}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB StackExchangeClustering", "type": "mteb/stackexchange-clustering", "config": "default", "split": "test", "revision": 
"6cbc1f7b2bc0622f2e39d2c77fa502909748c259"}, "metrics": [{"type": "v_measure", "value": 63.94924261127149}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB StackExchangeClusteringP2P", "type": "mteb/stackexchange-clustering-p2p", "config": "default", "split": "test", "revision": "815ca46b2622cec33ccafc3735d572c266efdb44"}, "metrics": [{"type": "v_measure", "value": 32.22297034902405}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB StackOverflowDupQuestions", "type": "mteb/stackoverflowdupquestions-reranking", "config": "default", "split": "test", "revision": "e185fbe320c72810689fc5848eb6114e1ef5ec69"}, "metrics": [{"type": "map", "value": 46.12948438780115}, {"type": "mrr", "value": 46.77186783804431}]}, {"task": {"type": "Summarization"}, "dataset": {"name": "MTEB SummEval", "type": "mteb/summeval", "config": "default", "split": "test", "revision": "cda12ad7615edc362dbf25a00fdd61d3b1eaf93c"}, "metrics": [{"type": "cos_sim_pearson", "value": 30.02235612863601}, {"type": "cos_sim_spearman", "value": 30.567504287706598}, {"type": "dot_pearson", "value": 28.943978981614897}, {"type": "dot_spearman", "value": 29.905635915797358}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB TRECCOVID", "type": "mteb/trec-covid", "config": "default", "split": "test", "revision": "bb9466bac8153a0349341eb1b22e06409e78ef4e"}, "metrics": [{"type": "map_at_1", "value": 0.173}, {"type": "map_at_10", "value": 1.124}, {"type": "map_at_100", "value": 5.645}, {"type": "map_at_1000", "value": 14.965}, {"type": "map_at_20", "value": 1.876}, {"type": "map_at_3", "value": 0.45599999999999996}, {"type": "map_at_5", "value": 0.699}, {"type": "mrr_at_1", "value": 70.0}, {"type": "mrr_at_10", "value": 81.786}, {"type": "mrr_at_100", "value": 81.786}, {"type": "mrr_at_1000", "value": 81.786}, {"type": "mrr_at_20", "value": 81.786}, {"type": "mrr_at_3", "value": 80.0}, {"type": "mrr_at_5", "value": 81.5}, {"type": "ndcg_at_1", "value": 65.0}, {"type": "ndcg_at_10", "value": 53.88699999999999}, {"type": "ndcg_at_100", "value": 38.028}, {"type": "ndcg_at_1000", "value": 37.183}, {"type": "ndcg_at_20", "value": 49.286}, {"type": "ndcg_at_3", "value": 63.05}, {"type": "ndcg_at_5", "value": 59.49100000000001}, {"type": "precision_at_1", "value": 70.0}, {"type": "precision_at_10", "value": 55.400000000000006}, {"type": "precision_at_100", "value": 38.800000000000004}, {"type": "precision_at_1000", "value": 17.082}, {"type": "precision_at_20", "value": 50.7}, {"type": "precision_at_3", "value": 66.667}, {"type": "precision_at_5", "value": 62.4}, {"type": "recall_at_1", "value": 0.173}, {"type": "recall_at_10", "value": 1.353}, {"type": "recall_at_100", "value": 8.887}, {"type": "recall_at_1000", "value": 36.012}, {"type": "recall_at_20", "value": 2.476}, {"type": "recall_at_3", "value": 0.508}, {"type": "recall_at_5", "value": 0.795}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB Touche2020", "type": "mteb/touche2020", "config": "default", "split": "test", "revision": "a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f"}, "metrics": [{"type": "map_at_1", "value": 2.614}, {"type": "map_at_10", "value": 6.651999999999999}, {"type": "map_at_100", "value": 11.59}, {"type": "map_at_1000", "value": 13.044}, {"type": "map_at_20", "value": 8.702}, {"type": "map_at_3", "value": 4.159}, {"type": "map_at_5", "value": 5.327}, {"type": "mrr_at_1", "value": 30.612000000000002}, {"type": "mrr_at_10", "value": 42.664}, {"type": "mrr_at_100", "value": 43.957}, {"type": "mrr_at_1000", "value": 43.957}, {"type": 
"mrr_at_20", "value": 43.193}, {"type": "mrr_at_3", "value": 40.476}, {"type": "mrr_at_5", "value": 42.007}, {"type": "ndcg_at_1", "value": 27.551}, {"type": "ndcg_at_10", "value": 18.098}, {"type": "ndcg_at_100", "value": 30.019000000000002}, {"type": "ndcg_at_1000", "value": 42.179}, {"type": "ndcg_at_20", "value": 19.552}, {"type": "ndcg_at_3", "value": 21.22}, {"type": "ndcg_at_5", "value": 19.774}, {"type": "precision_at_1", "value": 30.612000000000002}, {"type": "precision_at_10", "value": 15.101999999999999}, {"type": "precision_at_100", "value": 6.510000000000001}, {"type": "precision_at_1000", "value": 1.4569999999999999}, {"type": "precision_at_20", "value": 12.449}, {"type": "precision_at_3", "value": 22.448999999999998}, {"type": "precision_at_5", "value": 19.592000000000002}, {"type": "recall_at_1", "value": 2.614}, {"type": "recall_at_10", "value": 11.068}, {"type": "recall_at_100", "value": 42.317}, {"type": "recall_at_1000", "value": 79.063}, {"type": "recall_at_20", "value": 18.589}, {"type": "recall_at_3", "value": 5.06}, {"type": "recall_at_5", "value": 7.356}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB ToxicConversationsClassification", "type": "mteb/toxic_conversations_50k", "config": "default", "split": "test", "revision": "edfaf9da55d3dd50d43143d90c1ac476895ae6de"}, "metrics": [{"type": "accuracy", "value": 75.0146484375}, {"type": "ap", "value": 16.80191476928431}, {"type": "f1", "value": 58.08037205204817}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB TweetSentimentExtractionClassification", "type": "mteb/tweet_sentiment_extraction", "config": "default", "split": "test", "revision": "d604517c81ca91fe16a244d1248fc021f9ecee7a"}, "metrics": [{"type": "accuracy", "value": 61.80249009620826}, {"type": "f1", "value": 62.24155926661914}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB TwentyNewsgroupsClustering", "type": "mteb/twentynewsgroups-clustering", "config": "default", "split": "test", "revision": "6125ec4e24fa026cec8a478383ee943acfbd5449"}, "metrics": [{"type": "v_measure", "value": 47.074846780747094}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB TwitterSemEval2015", "type": "mteb/twittersemeval2015-pairclassification", "config": "default", "split": "test", "revision": "70970daeab8776df92f5ea462b6173c0b46fd2d1"}, "metrics": [{"type": "cos_sim_accuracy", "value": 85.21785778148656}, {"type": "cos_sim_ap", "value": 71.06584074764645}, {"type": "cos_sim_f1", "value": 65.81720166625826}, {"type": "cos_sim_precision", "value": 61.43641354071363}, {"type": "cos_sim_recall", "value": 70.87071240105541}, {"type": "dot_accuracy", "value": 84.30589497526375}, {"type": "dot_ap", "value": 68.85872202019365}, {"type": "dot_f1", "value": 64.20295157946092}, {"type": "dot_precision", "value": 59.69607620775687}, {"type": "dot_recall", "value": 69.44591029023746}, {"type": "euclidean_accuracy", "value": 85.21189724026942}, {"type": "euclidean_ap", "value": 71.18847194129523}, {"type": "euclidean_f1", "value": 66.00049962528105}, {"type": "euclidean_precision", "value": 62.66603415559773}, {"type": "euclidean_recall", "value": 69.70976253298153}, {"type": "manhattan_accuracy", "value": 85.25958157000656}, {"type": "manhattan_ap", "value": 71.12967638566641}, {"type": "manhattan_f1", "value": 65.77477594492791}, {"type": "manhattan_precision", "value": 64.77359938603223}, {"type": "manhattan_recall", "value": 66.80738786279683}, {"type": "max_accuracy", "value": 85.25958157000656}, {"type": "max_ap", 
"value": 71.18847194129523}, {"type": "max_f1", "value": 66.00049962528105}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB TwitterURLCorpus", "type": "mteb/twitterurlcorpus-pairclassification", "config": "default", "split": "test", "revision": "8b6510b0b1fa4e4c4f879467980e9be563ec1cdf"}, "metrics": [{"type": "cos_sim_accuracy", "value": 88.22330888345559}, {"type": "cos_sim_ap", "value": 84.40304506741951}, {"type": "cos_sim_f1", "value": 76.46823520855303}, {"type": "cos_sim_precision", "value": 72.45537867824409}, {"type": "cos_sim_recall", "value": 80.95164767477672}, {"type": "dot_accuracy", "value": 87.9400007761866}, {"type": "dot_ap", "value": 83.63499141834609}, {"type": "dot_f1", "value": 75.98620939938304}, {"type": "dot_precision", "value": 71.86792064254823}, {"type": "dot_recall", "value": 80.60517400677548}, {"type": "euclidean_accuracy", "value": 88.21166608452671}, {"type": "euclidean_ap", "value": 84.40463988450605}, {"type": "euclidean_f1", "value": 76.52312831312177}, {"type": "euclidean_precision", "value": 72.40621135083138}, {"type": "euclidean_recall", "value": 81.13643363104404}, {"type": "manhattan_accuracy", "value": 88.24659448131331}, {"type": "manhattan_ap", "value": 84.42287495905447}, {"type": "manhattan_f1", "value": 76.54849595413475}, {"type": "manhattan_precision", "value": 72.39036442248302}, {"type": "manhattan_recall", "value": 81.21342777948875}, {"type": "max_accuracy", "value": 88.24659448131331}, {"type": "max_ap", "value": 84.42287495905447}, {"type": "max_f1", "value": 76.54849595413475}]}]}]} |
vidhi0206/setfit-paraphrase-mpnet-base-v2 | vidhi0206 | text-classification | [
"setfit",
"safetensors",
"mpnet",
"sentence-transformers",
"text-classification",
"generated_from_setfit_trainer",
"arxiv:2209.11055",
"base_model:sentence-transformers/paraphrase-mpnet-base-v2",
"base_model:finetune:sentence-transformers/paraphrase-mpnet-base-v2",
"model-index",
"region:us"
]
| 2024-01-15T10:07:50 | 2024-02-14T21:46:15 | 3 | 0 | ---
base_model: sentence-transformers/paraphrase-mpnet-base-v2
library_name: setfit
metrics:
- accuracy
pipeline_tag: text-classification
tags:
- setfit
- sentence-transformers
- text-classification
- generated_from_setfit_trainer
widget:
- text: 'versace art portfolio up for sale the art collection of murdered fashion
designer gianni versace could fetch up to £9m ($17m) when it is auctioned in new
york and london later this year. among the pictures for sale are works by roy
lichtenstein andy warhol and henri matisse. the collection was housed at versace
s six-storey new york townhouse. the 51-year-old designer was shot outside his
florida home in 1997 by suspected serial killer andrew cunanan who later killed
himself. the auction at sotheby s will feature 45 contemporary impressionist
and 19th century paintings. one of the highlights of the sale is roy lichtenstein
s blue nude which has been given an estimate of £1.8m ($3.4m). tobias meyer sotheby
s worldwide head of contemporary art said: this collection reflects mr versace
s wide-ranging taste and impeccable eye and many of the works were commissioned
directly from the artists. outstanding later examples from champions of the pop
movement such as roy lichtenstein are juxtaposed with masterpieces from the
most visible artists of the 1980 s including jean-michel basquiat and the collaborative
genius of basquiat and warhol as well as francesco clemente. much of the collection
will be offered for sale at three auctions in new york in june with smaller contemporary
paintings going under the hammer in london on 22 and 23 june. a sale of versace
s furniture and artworks sold in 2001fetched £5.5m ($10.3m).'
- text: 'councils prepare to set tax rises council tax in scotland is set to rise
by an average of about 4% in the coming year bbc scotland has learned. authorities
will decide final figures on thursday when projected increases will be more than
twice the rate of inflation which is currently 1.6%. the finance minister has
urged councils to limit increases but they have warned that they will struggle
to maintain services unless funding is increased. they say much additional government
money is for new initiatives. scottish finance minister tom mccabe msp said: last
week in parliament i announced an additional £419m for core expenditure to local
government in scotland. that s a 5.5% increase and sits against an inflation
rate of 1.6% so i think we have quite rightly said to councils this year that
we would at the very least ask them to exercise restraint. mr mccabe is also
looking for local authorities to become more efficient and save money in coming
years. he told bbc radio scotland s sunday live programme: here in scotland we
have 32 councils who all have their own individual collection systems for council
tax they have their own payroll systems and their own human resource systems. we
think there has to be opportunities there for rationalisation and using the money
saved to reinvest in frontline services. the councils umbrella organisation
cosla which provided bbc scotland with the indicative figures for next year warned
that councils would face a continuous struggle to maintain services. mr mccabe
has promised them about £8.1bn next year. however most of the increase is targeted
to new initiatives and councils will experience difficulties in maintaining core
services a cosla spokesman said. cosla says that it is willing to work with
the executive on finding efficiency savings but that these will not be enough
to maintain services. they say the funding plans for the next three years will
see councils lose more of the share of public spending. the conservatives accuse
the scottish executive of using the council tax to raise funds because it is too
afraid to raise income tax. the tory finance spokesman brian monteith msp said: its
a form of disguise... yet again we see that council tax is being used as a way
of passing on costs. scared of actually using its three pence income tax that
it could put up what we ve seen over the years is more and more burdens being
put onto local authorities and the council tax payer having to pick up the bill. there
are also warnings that unless funding to councils is increased in the next few
years then services may have to be reduced. linda knox director of the scottish
local authority management centre at strathclyde university said: with this
current settlement the increase is slowing. at the same time the burdens on councils
are greater than they were. the settlement figures don t include pay increases
and the executive is also requiring a substantial figure - in the area of £325m
- in efficiency savings across the settlement period. education will be protected
from any cuts but linda knox says this will mean other services will suffer. she
said: in practice that will mean a 4-5% cut for other services. on the face
of it the settlement looks like an increase of about 9.7% but by the time you
take into account other factors its probably only about 1% in real terms.'
- text: gadget show heralds mp3 christmas partners of those who love their hi-tech
gear may want to get their presents in early as experts predict a gadget shortage
this christmas. with apple s ipod topping wish lists again there may not be
enough ipod minis to go round predicts oliver irish editor of gadget magazine
stuff. the ipod mini is likely to be this year s tracey island said mr irish.
stuff has compiled a list of the top 10 gadgets for 2004 and the ipod is at number
one. for anyone bewildered by the choice of gadgets on the market stuff and
what hi-fi are hosting a best-of gadget show in london this weekend. star of
the show will be sony s qrio robot an all-singing all-dancing football-playing
man-machine who can even hold intelligent conversations. but he is not for sale
and sony has no commercial plans for the robot. he will greet visitors and is
flying in from japan. he probably has his own airplane seat that is how highly
sony prize him said mr irish. also on display will be a virtual keyboard which
projects itself onto any flat surface. the event will play host to a large collection
of digital music players from companies such as creative sony and philips as
well as the ubiquitously fashionable ipod from apple. suggestions that it could
be a gaming or wireless christmas are unlikely to come true as mp3 players remain
the most popular stocking filler said mr irish. demand is huge and apple has
promised that it can supply enough but people might struggle to get their hands
on ipod minis said mr irish. for those who like their gadgets to be multi-talented the
gizmondo a powerful gaming console with gps and gprs that also doubles up as
an mp3 player movie player and camera could be a must-have. what is impressive
is how much it can do and how well it can do them said mr irish. this christmas gadgets
will not be an all-male preserve. women will be getting gadgets from husbands
and boyfriends as well as buying them for themselves said mr irish. gadgets
nowadays are lifestyle products rather than just for geeks.
- text: 'virus poses as christmas e-mail security firms are warning about a windows
virus disguising itself as an electronic christmas card. the zafi.d virus translates
the christmas greeting on its subject line into the language of the person receiving
infected e-mail. anti-virus firms speculate that this multilingual ability is
helping the malicious program spread widely online. anti-virus firm sophos said
that 10% of the e-mail currently on the net was infected with the zafi virus. like
many other windows viruses zafi-d plunders microsoft outlook for e-mail addresses
and then uses mail-sending software to despatch itself across the web to new victims.
to be infected users must open up the attachment travelling with the message which
bears the code for the malicious bug. the attachment on the e-mail poses as an
electronic christmas card but anyone opening it will simply get a crude image
of two smiley faces. the virus subject line says merry christmas and translates
this into one of 15 languages depending of the final suffix of the e-mail address
the infected message has been sent to. the message in the body of the e-mail reads: happy
holidays and this too is translated. on infected machines the virus tries to
disable anti-virus and firewall software and opens up a backdoor on the pc to
hand over control to the writer of the virus. the virus is thought to have spread
most widely in south america italy spain bulgaria and hungary. the original
zafi virus appeared in april this year. we have seen these hoaxes for several
christmases already and personally i prefer traditional pen and paper cards and
we recommend this to all our clients too said mikko hypponen who heads f-secure
s anti-virus team.'
- text: desailly backs blues revenge trip marcel desailly insists there is no chance
of history repeating itself when chelsea take on barcelona on wednesday. the
french star was part of the chelsea side crushed 5-1 at the nou camp in the champions
league quarter-final second leg in 2000. things will be totally different this
time he told bbc sport. now everyone knows about chelsea and is a little bit
afraid of them. they are one of the major clubs in europe and the pressure will
be on barcelona. chelsea have not played barcelona since that quarter-final tie
five years ago. the blues had looked destined to progress after winning the first
leg at stamford bridge 3-1 courtesy of two goals from tore andre flo and one
by gianfranco zola. but they collapsed in the second leg going down to strikes
from rivaldo (2) luis figo dani and patrick kluivert. former chelsea captain
desailly who is now playing for al-gharafa in qatar says there is no comparison
between that side and the current blues team who are top of the premiership. mentally
they are much stronger even though a lot of their players are young the 36-year-old
said. we made some mistakes at the nou camp in 2000 - a lot of them were individual
mistakes. it would not happen now. this team has a new motivation and a different
mentality. world cup winner desailly saw huge changes during his time at stamford
bridge. he was signed for £4.6m from ac milan in 1998 by ruud gullit and went
on to play under gianluca vialli and claudio ranieri. but the biggest change occurred
when billionaire roman abramovich bought the club in 2003. desailly says the russian
s arrival helped to instil a winning mentality at the club as well as a demand
for success. the whole of chelsea is different now - the chairman the manager
and all the players he said. everything is new and there is a huge determination
to win. since that game in 2000 chelsea have gained more experience in europe
and were very close to reaching the champions league final last season. desailly
is one of the most decorated players in the history of football. he won the 1998
world cup and 2000 european championship with france the champions league in
1993 with marseilles and 1994 with ac milan two serie a titles and the fa cup
in 2000 with chelsea. he is now winding down his career in qatar alongside the
likes of frank lebeouf josep guardiola titi camara gabriel batistuta and christophe
dugarry. so he is full of admiration for two of his colleagues from the great
milan side of the mid-90s who are likely to line up against manchester united
on wednesday - paolo maldini and alessandro costacurta. i m happy that they have
managed to play so long at a high level he said. i made a vow to costacurta
that as long as he plays i will continue to play. and it s amazing that paolo
has managed to play at such a high level for such a long time.
inference: true
model-index:
- name: SetFit with sentence-transformers/paraphrase-mpnet-base-v2
results:
- task:
type: text-classification
name: Text Classification
dataset:
name: Unknown
type: unknown
split: test
metrics:
- type: accuracy
value: 0.953
name: Accuracy
---
# SetFit with sentence-transformers/paraphrase-mpnet-base-v2
This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for Text Classification. This SetFit model uses [sentence-transformers/paraphrase-mpnet-base-v2](https://huggingface.co/sentence-transformers/paraphrase-mpnet-base-v2) as the Sentence Transformer embedding model. A [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance is used for classification.
The model has been trained using an efficient few-shot learning technique that involves:
1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
2. Training a classification head with features from the fine-tuned Sentence Transformer.
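As a rough illustration of this two-step procedure, below is a minimal training sketch using the `setfit` `Trainer` API (setfit >= 1.0). The tiny inline dataset, the label ids, and the hyperparameters are placeholders for illustration only; the card does not document the actual training data or settings used to produce this checkpoint.
```python
from datasets import Dataset
from setfit import SetFitModel, Trainer, TrainingArguments

# Placeholder few-shot split: a handful of labelled news snippets.
# The texts and 0-4 label ids are illustrative only (see the label table below).
train_dataset = Dataset.from_dict({
    "text": [
        "souped-up wi-fi is on the horizon super high-speed wireless data networks could soon be in use in the uk.",
        "ask jeeves tips online ad revival ask jeeves has become the third leading online search firm this week.",
        "woodward eyes brennan for lions toulouse s former irish international trevor brennan could be one of clive woodward s many surprises.",
    ],
    "label": [0, 1, 2],
})

# Start from the same Sentence Transformer body; setfit attaches a LogisticRegression head by default.
model = SetFitModel.from_pretrained("sentence-transformers/paraphrase-mpnet-base-v2")

args = TrainingArguments(batch_size=16, num_epochs=1)
trainer = Trainer(model=model, args=args, train_dataset=train_dataset)

# Step 1 (contrastive fine-tuning of the body) and step 2 (fitting the head) both happen here.
trainer.train()
```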
## Model Details
### Model Description
- **Model Type:** SetFit
- **Sentence Transformer body:** [sentence-transformers/paraphrase-mpnet-base-v2](https://huggingface.co/sentence-transformers/paraphrase-mpnet-base-v2)
- **Classification head:** a [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance
- **Maximum Sequence Length:** 512 tokens
- **Number of Classes:** 5 classes
<!-- - **Training Dataset:** [Unknown](https://huggingface.co/datasets/unknown) -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->
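The figures above can be checked on a loaded checkpoint; a minimal sketch, assuming the standard `model_body` (the Sentence Transformer) and `model_head` (the fitted scikit-learn head) attributes of a `SetFitModel`:
```python
from setfit import SetFitModel

model = SetFitModel.from_pretrained("vidhi0206/setfit-paraphrase-mpnet-base-v2")

print(model.model_body.max_seq_length)  # expected: 512
print(model.model_head.classes_)        # expected: the 5 trained class ids
```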
### Model Sources
- **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit)
- **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)
- **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit)
### Model Labels
| Label | Examples |
|:------|:---------|
| 4 | <ul><li>'peace demo appeal rejected peace protestors have lost a landmark appeal over police actions in stopping an anti-war demonstration days after the start of the iraq war. they had appealed against a high court decision that it was not unlawful for police to forcibly turn protestors away near raf fairford glos in 2003. the police had also sought to overturn a breach of human rights ruling arising from the same case. sitting on wednesday three appeal court judges dismissed both appeals. they were challenging decisions by two judges in the high court in february this year. it followed action by police when three coachloads of people were searched and detained on the way to raf fairford and forced to return to london under police escort. the demonstrators appealed against a finding by lord justice may and mr justice harrison that it was not unlawful for the police to turn the passengers away. the police were urging lord chief justice and lord justices clarke and rix to overturn the ruling that they had breached the protestors human rights by detaining them in the coaches. craig mackey assistant chief constable of gloucestershire police said: we have always considered that our responses were proportionate and all our decisions on the day were based on intelligence. he said no one on the coaches accepted responsibility for items found on the coaches including body armour a smoke bomb and five shields. given these circumstances and the fact that raf fairford and other military installations in the uk had been the scene of increasingly destructive disorder in the weeks preceding this incident the police commander on the ground made the decision to turn back the coaches. from day one we have vigorously defended this decision which was made out of a genuine concern that if the coaches were allowed to proceed it would have resulted in disorder and criminal damage at raf fairford. fairford coach action representing more than 80 people who appealed against the police actions say they are prepared to take their case to the european court of human rights. their action is supported by amnesty international and liberty.'</li><li>'kilroy launches veritas party ex-bbc chat show host and east midlands mep robert kilroy-silk has said he wants to change the face of british politics as he launched his new party. mr kilroy-silk who recently quit the uk independence party said our country was being stolen from us by mass immigration. he told a london news conference that veritas - latin for truth - would avoid the old parties lies and spin . ukip leader roger knapman says he was glad to see the back of mr kilroy-silk. mr kilroy-silk promised a firm but fair policy on immigration and said they hoped to contest most seats at the forthcoming general election. he said veritas would also announce detailed policies on crime tax pensions health and defence over the next few weeks. and he announced the party would be holding a leadership election. on thursday he is due to announce which constituency he will run in at the next general election - that will come amid speculation he has his sights set on defence secretary geoff hoon s ashfield seat. he was joined in the new venture by one of ukip s two london assembly members damien hockney who is now veritas deputy leader. ukip s chairman petrina holdsworth has said the group will just be a parody of the party the men have left. mr kilroy-silk announced his decision to quit ukip at a public meeting in hinckley leicestershire last week. it came after months of tension as he vied unsuccessfully for the leadership of that party. he said he was ashamed to be a member of a ukip whose leadership had gone awol after the great opportunity offered by its third place at last june s european elections. while ukip has turned its back on the british people i shall not he said. i will be standing at the next general election. i shall be leading a vigorous campaign for the causes i believe in. and unlike the old parties we shall be honest open and straight. mr hockney also left ukip saying mr kilroy-silk would deliver better as the leader of a eurosceptic party. a spokesman for ukip called on mr hockney to quit the london assembly. the party asserts that mr hockney has a moral obligation if not a legal one to stand down. its leader roger knapman has said he is glad to see the back of mr kilroy-silk. he has remarkable ability to influence people but sadly after the [european] election it became clear that he was more interested in the robert kilroy-silk party than the uk independence party so it was nice knowing him now goodbye he said. ukip officials also argue mr kilroy-silk has not been straightforward in attacking the party he once wanted to lead. this is just what the europhiles pray for. as the main eurosceptic party ukip should try to resolve its differences with kilroy to show a united front and give the uk public a serious political voice against europe. having multiple parties with the same view point just splits the vote further. thank goodness that kilroy-silk has gone - now ukip at least has a chance in the election! it is very sad to see the cause of britain regaining its proper relationship with europe damaged by this split within ukip. robert kilroy-silk could have a lot to offer. instead we have a split party and a damaged cause. under the present electoral system people must work together and small parties have no hope of representation. last summer ukip achieved a major advance partly and only partly due to kilroy-silk. it is a great shame this has been dissipated in in-fighting. ukip has a wide platform of policies not just withdrawal from the eu. this kilroy-silk conveniently ignores in the comments surrounding the launch of his own party. neither the english democrats nor the new party were interested in letting him join them and take over their leadership speaks volumes. veritas is the beginning of the end for kilroy-silk. if he believes in truth and democracy then he and the two assembly members should resign and force a by-elections to stand on their own platform rather than this backdoor approach to politics of being elected for one party then defecting to another. so ukip was good enough for him to lead not good enough for him to follow! interesting that a party committed to plain speaking should have a latin name! every opinion poll points to an overwhelming anti-europe feeling in this country. kilroy-silk could be on the verge of something huge if he can broaden his appeal beyond this one issue. he is an extremely able communicator with years of political experience. we wants quality schools top hospitals clean and efficient public transport punishments that fit the crime limited asylum a purge on bureaucracy and less taxes. it needs courage and honesty two qualities sadly lacking in our politicians. kilroy-silk may just have those very qualities. recruit the right colleagues robert and your time may have come! well if you cannot get enough limelight being an ordinary mp then go out and start up your own party. it s all flash and no real policy here let s hope this is the start of both ukip and kilroy-silk slipping into obscurity. veritas the name will doom it. but perhaps i am wrong for surely all modern schoolchildren will understand it since they do still learn latin in the classroom do they not the whole essence of what rks represents is euroscepticism so explain to me how the too-twee label of veritas symbolises that'</li><li>'lib dems target first-time buyers the liberal democrats have unveiled plans to build 100 000 new affordable homes on publicly owned land. the party s scheme would allow people to buy a share in a home through a mutual home ownership trust as a way of getting onto the housing ladder. the lib dems would also encourage the conversion of existing buildings in an effort to protect greenfield sites. labour has already announced plans to help first-time buyers and the tories would extend right-to-buy schemes. all the major parties are focusing on the issue in the run-up to the election after a survey suggested first-time buyers could not afford a home in 92% of uk towns. the lib dems say their mutual homes would let people buy a share of a property usually worth about 5% of the building costs. party leader charles kennedy said the homes would be affordable because they would be built on surplus public sector land donated by central or local government. people would also only have to pay for the cost of the building and not the land he added. they would spend about 30% of their monthly salary on rent and buying extra shares in the property. when they moved house they would be able to cash in on any rise in property prices by selling their share. it would also allow councils to vary discounts to tenants given the right to buy their council homes so local needs were taken into account. mr kennedy said: mutual homes will offer people the opportunity to build up an equity stake in a home gradually investing only as much as they can afford. there are also plans to prevent high house prices forcing people out of their local communities. the kind of golden share used by the lib dems in south shropshire could be rolled out more widely. under the plan councils secure deals with developers where they keep a 1% share in a property scheme so properties cannot be sold on the open market. instead they are sold at build cost to people who the local council decides have local needs. the party says its help for first-time buyers can be funded at no extra cost to the taxpayer. but the plans involve changing the vat system which the party says often makes it too expensive to renovate existing buildings. the conservatives claimed the plans would amount to an extra tax of up to £11 000 on every new house. this is typical of lib dem hypocrisy said tory shadow local government secretary caroline spelman. they claim that they want to help people on to the property ladder but the small print of their policies reveal how they intend to price even more people out of the housing market. the flagship tory proposal on housing policy is to give a million more housing association tenants the right to buy their homes. labour has said it will allow 300 000 council and housing association tenants to buy a share in their homes. housing minister keith hill said much of the lib dem plans mimicked the government s strategy. however as usual the lib dems proposals are completely uncosted he said. mr hill said he also asked whether the lib dems would match labour s promise to spend £42bn on making refurbishing and repair council homes by 2010.'</li></ul> |
| 0 | <ul><li>'souped-up wi-fi is on the horizon super high-speed wireless data networks could soon be in use in the uk. the government s wireless watchdog is seeking help on the best way to regulate the technology behind such networks called ultra wideband (uwb). ofcom wants to ensure that the arrival of uwb-using devices does not cause problems for those that already use the same part of the radio spectrum. uwb makes it possible to stream huge amounts of data through the air over short distances. one of the more likely uses of uwb is to make it possible to send dvd quality video images wirelessly to tv screens or to let people beam music to media players around their home. the technology has the potential to transmit hundreds of megabits of data per second. uwb could also be used to create so-called personal area networks that let a person s gadgets quickly and easily swap data amongst themselves. the technology works over a range up to 10 metres and uses billions of short radio pulses every second to carry data. at the recent consumer electronics show in las vegas products with uwb chips built-in got their first public airing. currently use of uwb is only allowed in the uk under a strict licencing scheme. we re seeking opinion from industry to find out whether or not we should allow uwb on a licence-exempt basis said a spokesman for ofcom. companies have until 24 march to respond. in april the ec is due to start its own consultation on europe-wide adoption of uwb. the cross-europe body for radio regulators known as the european conference of postal and telecommunications administrations (cept) is carrying out research for this harmonisation programme. early sight of the cept work has caused controversy as some think it over-emphasises uwb s potential to interfere with existing users. by contrast a preliminary ofcom report found that it would be quite straight-forward to deploy uwb without causing problems for those that already use it. the ofcom spokesman said it was considering imposing a mask or set of technical restrictions on uwb-using devices. we would want these devices to have very strict controls on power levels so they can not transmit a long way or over a wide area he said. despite the current restrictions the technology is already being used. cambridge-based ubisense has about 40 customers around the world using the short-range radio technology said david theriault standards and regulatory liaison for ubisense. he said that uwb was driving novel ways to interact with computers. it s like having a 3d mouse all the time he said. he said that european decisions on what to do with uwb allied with ieee decisions on the exact specifications for it would help drive adoption. prior to its adoption as a way for gadgets and computers to communicate uwb was used as a sensing technology. it is used to spot such things as cracks under the surface of runways or to help firemen detect people through walls.'</li><li>'microsoft plans safer id system microsoft is planning to make windows and internet explorer more secure by including software to give people more control over personal information. info cards will help people manage personal details on their pcs to make online services safer said microsoft. microsoft s two previous programs passport and hailstorm aimed to protect users but were criticised. id fraud is one of the uk s fastest-growing crimes with criminals netting an estimated £1.3bn last year. a quarter of uk adults has either had their id stolen via hi-tech or other means or knows someone who has a recent report by which magazine found. microsoft is developing a new version of internet explorer browser and its operating system windows which has been code-named longhorn. michael stephenson director in microsoft s windows server division would not confirm however whether the new info cards id system will be built into the current windows xp version or longhorn. we re trying to make the end-user experience as simple as possible mr stephenson said. the system would differ from its previous attempts to make online transactions more secure said microsoft. while passport and hailstorm stored user information centrally on the net the latest system will store data on a user s pc. it s going to put control of digital ids into the hands of an end-user the end-user will be in full control said mr stephenson. hailstorm was criticised by privacy campaigners for putting too much sensitive information into the hands of a single company. passport provides a single log-in for more than one website and stores basic personal information. but its popularity suffered after security scares. up to 200 million passport accounts were left vulnerable to online theft and malicious hackers after a flaw in the system was exploited in 2003. online auction site ebay stopped supporting it in january 2005. although the flaw was fixed microsoft has come under regular criticism for the number of security loopholes in internet explorer. last year it released a major security update for windows service pack 2 to combat some of the security concerns. longhorn is due to be released commercially in late 2006 but an updated version of internet explorer is due for release later this year.'</li><li>'casual gaming to take off games aimed at casual players are set to be even bigger in 2005 according to industry experts. easy-to-play titles that do not require too much time and that are playable online or downloadable to mobile devices will see real growth in the coming year. the trend shows that gaming is not just about big-hitting games console titles which appeal more to hardcore gamers said a panel of experts. they were speaking before the annual consumer electronics show in las vegas which showcases the latest trends in gadgets and technologies for 2005. the panel also insisted that casual gamers were not just women a common misconception which pervades current thinking about gamer demographics. casual games like poker pool bridge bingo and puzzle-based titles which can be played online or downloaded onto mobile devices were gender neutral and different genres attracted different players. greg mills program director at aol said its figures suggested that sports-based games attracted 90% of 18 to 24-year-old males while puzzle games were played by 80% of females. games like bridge tended to attract the over-50 demographic of gamers. but hardcore gamers who are more attracted to blockbuster gamers which usually require hi-spec pcs like half-life 2 or halo 2 on xbox also liked to have a different type of gaming experience. when hardcore gamers are not playing halo they are playing poker and pool based on our research said geoff graber director of yahoo games which attracts about 12 million gamers a month. with the growth of powerful pc technology and ownership broadband take-up portable players and mobile devices as well as interactive tv casual gaming is shaping up to be big business in 2005 according to the panel. the focus for the coming year should be about attracting third-party developers into the field to offer more innovative and multiplayer titles they agreed. we are at a time where we are on the verge of something much bigger said mr graber. casual games will get into their stride in 2005 will be really big in 2006 and will be about community. with more people finding more to do with their gadgets and high-speed connections casual games would start to open up the world of gaming as a form of mass-market entertainment to more people. key to these types of titles is the chance they give people who may not see themselves as gamers to dip in and out of games when they liked. portal sites which offer casual games like aol yahoo and realarcade as well as other games-on-demand services allow people to build up buddy lists so they can return and play against the same people. this aspect of community is crucial for gamers who just want to have quick access to free or cheap games without committing long periods of time immersed in £30 to £40 console or pc titles said the panel. about 120 000 people are expected to attend the ces trade show which stretches over more than 1.5 million square feet and which officially runs from 6 to 9 january. the main theme is how new devices are getting better at talking to each other allowing people to enjoy digital content like audio video and images when they want and where they want.'</li></ul> |
| 2 | <ul><li>'woodward eyes brennan for lions toulouse s former irish international trevor brennan could be one of clive woodward s many surprises when the 44-man lions tour squad is announced. brennan who last played for ireland against samoa in 2001 is held in high esteem by the former england coach. if you speak to the players there s a huge amount of respect for the guy woodward told the sunday independent. players tend to know better than most coaches. it s not just the irish but welsh and english players as well. the 31-year-old former dublin milkman moved from leinster to toulouse in 2003 and immediately picked up a heineken cup winner s medal in an all-french final against perpignan at lansdowne road. brennan is highly-rated at stade toulousain where he is used anywhere in the back five. woodward is ensuring his preparations for the trip to new zealand in june are as thorough as possible. i ve spoken to quite a few players and they probably don t know what they re actually saying when we re having these conversations he told the newspaper. but you talk about certain players and they ll say if they think they re up to scratch or that they don t want them in their team. i haven t heard a bad word said against trevor which considering he has a pretty tough guy reputation is to me impressive.'</li><li>'off-colour gardener storms to win britain s jason gardener shook off an upset stomach to win the 60m at sunday s leipzig international meeting. gardener clocked 6.56 seconds to equal the meeting record and finished well ahead of germany s marc blume who crossed the line in 6.67 secs. the world indoor champion said: i got to the airport and my stomach was upset and i was vomiting. i almost went home. i felt a little better sunday morning but decided i d only run in the main race. then everything went perfectly. gardener part of the great britain 4x100m quartet that won gold at the athens olympics will now turn his attention to next weekend s norwich union european indoor trials in sheffield. given i am still off-colour i know there is plenty more in the tank and i expect to get faster in the next few weeks he said. it s just a case of chipping away as i have done in previous years and the results will come. scotland s ian mackie was also in action in leipzig. he stepped down from his favoured 400m to 200m to finish third in 21.72 secs. germany s alexander kosenkow won the race in 21.07 secs with dutchman patrick van balkom second in 21.58 secs. there were plenty of other senior british athletes showing their indoor form over the weekend. promising 60m hurdler clocked a new uk record of 7.98 seconds at a meeting in norway. the 24-year-old reached the mark in her heat but had to settle for joint first place with former aaa champion diane allahgreen in the final. who broke onto the international scene at the olympic games last season set an indoor personal best of 16.50m in the triple jump at a meeting in ghent. that leap - 37cm short of brazilian winner jadel gregorio s effort - was good enough to qualify for the european indoor championships. at the same meeting finished third in 7.27 seconds in a high-class women s 60m. the event was won by european medal favourite christine arron of france while belgium rival kim gevaert was second. britain s joice maduaka finished fifth in 7.35. olympic bronze heptathlon medallist made a low-key return to action at an indoor meeting in birmingham. the 28-year-old cleared 1.76m to win the high jump and threw 13.86m in the women s shot put.'</li><li>'parry relishes anfield challenge bbc sport reflects on the future for liverpool after our exclusive interview with chief executive rick parry. chief executive parry is the man at the helm as liverpool reach the most crucial point in their recent history. parry has to deliver a new 60 000-seat stadium in stanley park by 2007 amid claims of costs spiralling above £120m. he is also searching for an investment package of a size and stature that will restore liverpool to their place at european football s top table. but it is a challenge that appears to sit easily with parry who has forged a reputation as one of football s most respected administrators since his days at the fledgling premier league. liverpool have not won the championship since 1990 a fact that causes deep discomfort inside anfield as they attempt to muscle in on the top three of chelsea manchester united and arsenal. throw in the small matter of warding off every top club in world football as they eye captain steven gerrard and you can see parry is a man with a lot on his plate. but in the comfort of a conference room deep inside liverpool s heartbeat - the kop end - parry spoke to us with brutal honesty about the crucial months ahead. he only dodged one question - when asked to reveal the name of the mystery investor currently courting liverpool a polite smile deflected the inquiry. but to his credit he met everything else head on in measured tones that underscore the belief that liverpool still mean business. by business he means becoming title challengers again and locking the pieces together that will help return the trophy to liverpool is parry s mission. parry has already successfully put one of those planks in place in the form of new manager rafael benitez. and his enthusiasm for the spaniard s personality and methods is an indication of his clear feeling that he has struck gold. benitez s early work has given parry renewed optimism about the years ahead. but it remains a massive task at a club with a unique history and expectations. this will not come as news to parry a lifelong liverpool supporter but his quiet determination suggests he is no mood to be found wanting... captain gerrard is central to liverpool s plans and parry s insistence that all offers will be refused is a firm statement of intent. as ever the player will have the final say and parry acknowledges that but he is determined to provide the framework and environment for liverpool and gerrard to flourish. in terms of the search for new investment hawkpoint were appointed as advisors to flush out interest in march 2004. thailand prime minister thaksin shiniwatra came and went while the most serious statement of intent came from tycoon and lifelong fan steve morgan. morgan had a succession of bids rejected having come close in the summer only for talks to break down over potential costs for the new stadium. bbc sport understands morgan is still ready and willing to invest in liverpool and parry has kept the door ajar despite currently seeking investment elsewhere. morgan however has had no formal contact with liverpool or their advisors since last december blaming indecision at board level as he publicly withdrew his £70m offer. he was also convinced his interest was being used to lure in others so any new approach would now have to come from liverpool. morgan will certainly not be making another call. so speculation continues about the new benefactor with trails leading to the middle east and america but all met with an understandable veil of secrecy from anfield. parry meanwhile sees the new ground as crucial to liverpool s future but is refusing to become emotionally attached to the idea. he is determined the ground will only be built on an affordable basis and will not make future liverpool management hostages to the new stadium. parry will pull back the moment the figures do not stack up but there has been a vital new development in north london that has re-shaped liverpool s thinking. liverpool have publicly refused to entertain the idea of stadium sponsorship and potential naming rights - but the realism of arsenal s stunning £100m deal for their new emirates stadium at ashburton has changed the landscape. parry labelled the deal an eye-opener and admits liverpool would be missing a trick not to explore the possibilities. he knows some traditionalist liverpool fans will reel at any attempt to call the new stadium anything other than just anfield but the maths of modern-day football decree that multi-millions for stadium and team could ease the pain. i would take £50m if we had no investment but if we did keep him. as for the stadium if it gets us cash what difference does it make really £50m for gerrard i don t care who you are the directors would take the money and it is the way it should be. we cannot let that sum of money go despite gerrard s quality. through a cleverly worded statement the club has effectively forced gerrard to publicly make the decision for himself which i think is the right thing to do. critical time for liverpool with regards to gerrard. ideally we would want to secure his future to the club for the long term. i am hoping he doesn t walk out of the club like michael owen did for very little cash. £50m realistically would allow rafa to completely rebuild the squad however if we can afford to do this and keep gerrard we will be better for it. i would however be happy with gerrard s transfer for any fee over £35m. parry s statements are clever in that any future gerrard transfer cannot be construed as a lack of ambition by the club to not try and keep their best players. upping the ante is another smart move by parry. i would keep gerrard. no amount of money could replace his obvious love of the club and determination to succeed. the key is if gerrard comes out and says that he is happy. clearly if he isn t then we would be foolish not to sell. the worrying thing is who would you buy (or who would come) pending possible non-champions league football.'</li></ul> |
| 3 | <ul><li>'rem announce new glasgow concert us band rem have announced plans to perform for 10 000 scottish fans in a rescheduled gig. the band will play in what has been dubbed europe s biggest tent on glasgow green on tuesday 14 june. they were forced to pull out of a concert at the secc in glasgow last month after bassist mike mills contracted flu. fans who bought tickets for the original 22 february show can attend the rescheduled concert. the june gig will act as a warm-up for rem s open air concert at balloch castle country park on the banks of loch lomond four days later. promoters regular music booked glasgow green as the secc was not available on the most suitable date. mark mackie director of regular music said: it is fantastic news and it really shows rem s commitment to their scottish fans that they are coming back to glasgow for what will be a truly unique gig. the rem gigs will kick-start what promises to be a memorable summer for scottish music lovers. grammy award winners u2 will play hampden on 21 june while oasis will also perform at the national stadium in glasgow on 29 june. coldplay have announced a concert at bellahouston park in glasgow on 1 july and t in the park will be held at balado near kinross from 9-10 july. ticketweb and the secc box office will write to customers who bought tickets for the february gig asking if they want to attend the new show. those who bought tickets in person are being urged to return to the point of purchase. anyone who cannot make the concert will be given a refund. the cut-off date for swapping tickets is 1 april when those remaining will go on sale to the public.'</li><li>'tautou film tops cesar prize nods french film a very long engagement has received 12 nominations for france s cesar film awards despite a recent ruling it was not french enough . the world war i romantic drama starring audrey tautou was recently ruled too american by a paris court as it was partially backed by warner bros. but the cesar organisers modified their rules to allow the film to compete. the film directed by jean-pierre jeunet received best actress picture and director nominations. last november a court judged the film was too american to compete in french film festivals. two associations of french producers challenged jeunet s right to french government subsidies because warner bros was a backer. the ruling meant the movie - which was filmed in france and used french actors and technicians - was not eligible to compete for french prizes. but alain terzian president of cesar organisers the academie des arts et techniques du cinema said the changes in eligibility rules which allow films of french expression were made three months prior to the court decision. other films in the best film category include police drama 36 quai des orfevres arnaud desplechin s kings and queen abdellatif kechiche s l esquive and france s number one film at the 2004 box-office the chorus. best actors are daniel auteuil for 36 mathieu amalric for kings and queen gerard jugnot for the chorus philippe torreton for l equipier and benoit poelvoorde for podium. tautou will compete against maggie cheung emmanuelle devos yolande moreau and karin viard for best actress. michael moore s fahrenheit 9/11 the motorcycle diaries lost in translation eternal sunshine of the spotless mind and 21 grams are all vying in the best foreign film prize. the awards ceremony will be held on 26 february. this year will smith star of i robot independence day and men in black will be given an honorary cesar along with french singer/actor jacques dutronc.'</li><li>'gallery unveils interactive tree a christmas tree that can receive text messages has been unveiled at london s tate britain art gallery. the spruce has an antenna which can receive bluetooth texts sent by visitors to the tate. the messages will be unwrapped by sculptor richard wentworth who is responsible for decorating the tree with broken plates and light bulbs. it is the 17th year that the gallery has invited an artist to dress their christmas tree. artists who have decorated the tate tree in previous years include tracey emin in 2002. the plain green norway spruce is displayed in the gallery s foyer. its light bulb adornments are dimmed ordinary domestic ones joined together with string. the plates decorating the branches will be auctioned off for the children s charity artworks. wentworth worked as an assistant to sculptor henry moore in the late 1960s. his reputation as a sculptor grew in the 1980s while he has been one of the most influential teachers during the last two decades. wentworth is also known for his photography of mundane everyday subjects such as a cigarette packet jammed under the wonky leg of a table.'</li></ul> |
| 1 | <ul><li>'ask jeeves tips online ad revival ask jeeves has become the third leading online search firm this week to thank a revival in internet advertising for improving fortunes. the firm s revenue nearly tripled in the fourth quarter of 2004 exceeding $86m (£46m). ask jeeves once among the best-known names on the web is now a relatively modest player. its $17m profit for the quarter was dwarfed by the $204m announced by rival google earlier in the week. during the same quarter yahoo earned $187m again tipping a resurgence in online advertising. the trend has taken hold relatively quickly. late last year marketing company doubleclick one of the leading providers of online advertising warned that some or all of its business would have to be put up for sale. but on thursday it announced that a sharp turnaround had brought about an unexpected increase in profits. neither ask jeeves nor doubleclick thrilled investors with their profit news however. in both cases their shares fell by some 4%. analysts attributed the falls to excessive expectations in some quarters fuelled by the dramatic outperformance of google on tuesday.'</li><li>'us bank boss hails genius smith us federal reserve chairman alan greenspan has given a speech at a scottish church in honour of the pioneering economist adam smith. he delivered the 14th adam smith lecture in kirkcaldy fife. the adam smith lecture celebrates the author of 1776 s wealth of nations which became a bible of capitalism. dr greenspan was invited by chancellor gordon brown whose minister father john used to preach at the st bryce kirk church. mr brown introduced dr greenspan to the 400 invited guests as the the world s greatest economist . dr greenspan 79 who has been in the uk to attend the g7 meeting in london said the world could never repay the debt of gratitude it owed to smith whose genius he compared to that of mozart. he said the philosopher was a towering contributor to the modern world . kirkcaldy the birthplace in 1723 of adam smith and by extension of modern economics is also of course where your chancellor was reared. i am led to ponder to what extent the chancellor s renowned economic and financial skills are the result of exposure to the subliminal intellect-enhancing emanation in this area. he continued: smith reached far beyond the insights of his predecessors to frame a global view of how market economics just then emerging worked. in so doing he supported changes in societal organisation that were to measurably enhance standards of living. dr greenspan said smith s revolutionary philosophy on human self-interest laissez-faire economics and competition had been a force for good in the world. the incredible insights of a handful of intellectuals of the enlightenment - especially with smith toiling in the environs of kirkcaldy - created the modern vision of people free to choose and to act according to their individual self-interest he said. following his lecture dr greenspan - who received an honorary knighthood from the queen at balmoral in 2002 - was awarded an honorary fellowship of the royal society of edinburgh. he later opened an exhibition dedicated to smith in the atrium of fife college of further and higher education. joyce johnston principal of the college said: it is very fitting that the world s premier economist delivered this lecture in tribute to the world s first economist. dr greenspan - who became chairman of the federal reserve for an unprecedented fifth term in june 2004 - will step down in january next year. he has served under presidents george w bush bill clinton george bush and ronald reagan. he was also chairman of the council of economic advisors to gerald ford.'</li><li>'hariri killing hits beirut shares shares in solidere the lebanese company founded by assassinated former prime minister rafik hariri fell 15% in renewed trading in beirut. the real estate firm which dominates lebanon s stock exchange ended the day down at $8.08. traders said there was some panic selling during friday s session the first since a three-day market closure to mourn the death of mr hariri. beirut s benchmark blom stock index closed down 7.9% at 642.80. solidere in which mr hariri was a major shareholder was the major drag on the index. the company owns much of the property in central beirut which it restored and redeveloped following the end of lebanon s bitter 15-year civil war. solidere should be above $10 but because of this disaster it is falling said one trader. if solidere drops much lower i would consider it a buying opportunity. this is a very big company held by many lebanese. critics had accused mr hariri of using lebanon s post-war reconstruction drive for his personal financial gain. but his assassination on monday sent shudders through lebanon s business community which saw the billionaire tycoon as the country s best hope for economic revival. solidere posted profits of $12.5m in the first half of 2004 and its shares had been gaining in recent months.'</li></ul> |
## Evaluation
### Metrics
| Label | Accuracy |
|:--------|:---------|
| **all** | 0.953 |
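The accuracy above is computed by the SetFit `Trainer`. A minimal sketch of reproducing such a score on a held-out split, assuming a `Dataset` with `text` and `label` columns (the card's own evaluation split is not published):

```python
from datasets import Dataset
from setfit import SetFitModel, Trainer

model = SetFitModel.from_pretrained("vidhi0206/setfit-paraphrase-mpnet-base-v2")

# Placeholder split: substitute your own labelled examples here.
eval_dataset = Dataset.from_dict({"text": ["..."], "label": [0]})

trainer = Trainer(model=model, eval_dataset=eval_dataset, metric="accuracy")
print(trainer.evaluate())  # returns a dict such as {'accuracy': 0.953}
```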
## Uses
### Direct Use for Inference
First install the SetFit library:
```bash
pip install setfit
```
Then you can load this model and run inference.
```python
from setfit import SetFitModel
# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("vidhi0206/setfit-paraphrase-mpnet-base-v2")
# Run inference
preds = model("versace art portfolio up for sale the art collection of murdered fashion designer gianni versace could fetch up to £9m ($17m) when it is auctioned in new york and london later this year. among the pictures for sale are works by roy lichtenstein andy warhol and henri matisse. the collection was housed at versace s six-storey new york townhouse. the 51-year-old designer was shot outside his florida home in 1997 by suspected serial killer andrew cunanan who later killed himself. the auction at sotheby s will feature 45 contemporary impressionist and 19th century paintings. one of the highlights of the sale is roy lichtenstein s blue nude which has been given an estimate of £1.8m ($3.4m). tobias meyer sotheby s worldwide head of contemporary art said: this collection reflects mr versace s wide-ranging taste and impeccable eye and many of the works were commissioned directly from the artists. outstanding later examples from champions of the pop movement such as roy lichtenstein are juxtaposed with masterpieces from the most visible artists of the 1980 s including jean-michel basquiat and the collaborative genius of basquiat and warhol as well as francesco clemente. much of the collection will be offered for sale at three auctions in new york in june with smaller contemporary paintings going under the hammer in london on 22 and 23 june. a sale of versace s furniture and artworks sold in 2001fetched £5.5m ($10.3m).")
```
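The predictions are integer class ids. The id-to-topic mapping below is only inferred from the label examples in this card and is not stored in the checkpoint, so treat it as an assumption:

```python
# Hypothetical mapping, read off the example table in this card:
# 0 ~ tech, 1 ~ business, 2 ~ sport, 3 ~ entertainment, 4 ~ politics.
id2label = {0: "tech", 1: "business", 2: "sport", 3: "entertainment", 4: "politics"}

preds = model.predict([
    "shares in solidere fell 15% in renewed trading in beirut...",
    "rem announce new glasgow concert...",
])  # one class id per input text
print([id2label[int(p)] for p in preds])
```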
<!--
### Downstream Use
*List how someone could finetune this model on their own dataset.*
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Set Metrics
| Training set | Min | Median | Max |
|:-------------|:----|:--------|:-----|
| Word count | 173 | 419.325 | 1121 |
| Label | Training Sample Count |
|:------|:----------------------|
| 0 | 8 |
| 1 | 8 |
| 2 | 8 |
| 3 | 8 |
| 4 | 8 |
### Training Hyperparameters
- batch_size: (8, 8)
- num_epochs: (1, 1)
- max_steps: -1
- sampling_strategy: oversampling
- num_iterations: 20
- body_learning_rate: (2e-05, 2e-05)
- head_learning_rate: 2e-05
- loss: CosineSimilarityLoss
- distance_metric: cosine_distance
- margin: 0.25
- end_to_end: False
- use_amp: False
- warmup_proportion: 0.1
- seed: 42
- eval_max_steps: -1
- load_best_model_at_end: False
### Training Results
| Epoch | Step | Training Loss | Validation Loss |
|:-----:|:----:|:-------------:|:---------------:|
| 0.005 | 1 | 0.245 | - |
| 0.25 | 50 | 0.0174 | - |
| 0.5 | 100 | 0.0008 | - |
| 0.75 | 150 | 0.0005 | - |
| 1.0 | 200 | 0.0002 | - |
### Framework Versions
- Python: 3.8.10
- SetFit: 1.0.3
- Sentence Transformers: 2.3.1
- Transformers: 4.37.2
- PyTorch: 2.2.0+cu121
- Datasets: 2.17.0
- Tokenizers: 0.15.1
## Citation
### BibTeX
```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
doi = {10.48550/ARXIV.2209.11055},
url = {https://arxiv.org/abs/2209.11055},
author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Efficient Few-Shot Learning Without Prompts},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->
| ["TEXT_CLASSIFICATION", "TRANSLATION"] | ["MEDAL"] | Non_BioNLP |
# SetFit with sentence-transformers/paraphrase-mpnet-base-v2
This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for Text Classification. This SetFit model uses [sentence-transformers/paraphrase-mpnet-base-v2](https://huggingface.co/sentence-transformers/paraphrase-mpnet-base-v2) as the Sentence Transformer embedding model. A [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance is used for classification.
The model has been trained using an efficient few-shot learning technique that involves:
1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
2. Training a classification head with features from the fine-tuned Sentence Transformer.
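For reference, a minimal sketch of this two-stage recipe with the `setfit` API, using a tiny illustrative dataset in place of the card's real training data:

```python
from datasets import Dataset
from setfit import SetFitModel, Trainer, TrainingArguments

# Illustrative few-shot data; this card was trained with 8 examples per class.
train_dataset = Dataset.from_dict({
    "text": ["microsoft plans safer id system ...", "hariri killing hits beirut shares ..."],
    "label": [0, 1],
})

model = SetFitModel.from_pretrained("sentence-transformers/paraphrase-mpnet-base-v2")
args = TrainingArguments(batch_size=8, num_epochs=1, num_iterations=20)

# Trainer.train() runs both stages: contrastive fine-tuning of the embedding
# body, then fitting the LogisticRegression head on the resulting embeddings.
trainer = Trainer(model=model, args=args, train_dataset=train_dataset)
trainer.train()
```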
## Model Details
### Model Description
- **Model Type:** SetFit
- **Sentence Transformer body:** [sentence-transformers/paraphrase-mpnet-base-v2](https://huggingface.co/sentence-transformers/paraphrase-mpnet-base-v2)
- **Classification head:** a [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance
- **Maximum Sequence Length:** 512 tokens
- **Number of Classes:** 5 classes
<!-- - **Training Dataset:** [Unknown](https://huggingface.co/datasets/unknown) -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->
### Model Sources
- **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit)
- **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)
- **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit)
### Model Labels
| Label | Examples |
|:------|:---------|
| 4 | <ul><li>'peace demo appeal rejected peace protestors have lost a landmark appeal over police actions in stopping an anti-war demonstration days after the start of the iraq war. they had appealed against a high court decision that it was not unlawful for police to forcibly turn protestors away near raf fairford glos in 2003. the police had also sought to overturn a breach of human rights ruling arising from the same case. sitting on wednesday three appeal court judges dismissed both appeals. they were challenging decisions by two judges in the high court in february this year. it followed action by police when three coachloads of people were searched and detained on the way to raf fairford and forced to return to london under police escort. the demonstrators appealed against a finding by lord justice may and mr justice harrison that it was not unlawful for the police to turn the passengers away. the police were urging lord chief justice and lord justices clarke and rix to overturn the ruling that they had breached the protestors human rights by detaining them in the coaches. craig mackey assistant chief constable of gloucestershire police said: we have always considered that our responses were proportionate and all our decisions on the day were based on intelligence. he said no one on the coaches accepted responsibility for items found on the coaches including body armour a smoke bomb and five shields. given these circumstances and the fact that raf fairford and other military installations in the uk had been the scene of increasingly destructive disorder in the weeks preceding this incident the police commander on the ground made the decision to turn back the coaches. from day one we have vigorously defended this decision which was made out of a genuine concern that if the coaches were allowed to proceed it would have resulted in disorder and criminal damage at raf fairford. fairford coach action representing more than 80 people who appealed against the police actions say they are prepared to take their case to the european court of human rights. their action is supported by amnesty international and liberty.'</li><li>'kilroy launches veritas party ex-bbc chat show host and east midlands mep robert kilroy-silk has said he wants to change the face of british politics as he launched his new party. mr kilroy-silk who recently quit the uk independence party said our country was being stolen from us by mass immigration. he told a london news conference that veritas - latin for truth - would avoid the old parties lies and spin . ukip leader roger knapman says he was glad to see the back of mr kilroy-silk. mr kilroy-silk promised a firm but fair policy on immigration and said they hoped to contest most seats at the forthcoming general election. he said veritas would also announce detailed policies on crime tax pensions health and defence over the next few weeks. and he announced the party would be holding a leadership election. on thursday he is due to announce which constituency he will run in at the next general election - that will come amid speculation he has his sights set on defence secretary geoff hoon s ashfield seat. he was joined in the new venture by one of ukip s two london assembly members damien hockney who is now veritas deputy leader. ukip s chairman petrina holdsworth has said the group will just be a parody of the party the men have left. mr kilroy-silk announced his decision to quit ukip at a public meeting in hinckley leicestershire last week. it came after months of tension as he vied unsuccessfully for the leadership of that party. he said he was ashamed to be a member of a ukip whose leadership had gone awol after the great opportunity offered by its third place at last june s european elections. while ukip has turned its back on the british people i shall not he said. i will be standing at the next general election. i shall be leading a vigorous campaign for the causes i believe in. and unlike the old parties we shall be honest open and straight. mr hockney also left ukip saying mr kilroy-silk would deliver better as the leader of a eurosceptic party. a spokesman for ukip called on mr hockney to quit the london assembly. the party asserts that mr hockney has a moral obligation if not a legal one to stand down. its leader roger knapman has said he is glad to see the back of mr kilroy-silk. he has remarkable ability to influence people but sadly after the [european] election it became clear that he was more interested in the robert kilroy-silk party than the uk independence party so it was nice knowing him now goodbye he said. ukip officials also argue mr kilroy-silk has not been straightforward in attacking the party he once wanted to lead. this is just what the europhiles pray for. as the main eurosceptic party ukip should try to resolve its differences with kilroy to show a united front and give the uk public a serious political voice against europe. having multiple parties with the same view point just splits the vote further. thank goodness that kilroy-silk has gone - now ukip at least has a chance in the election! it is very sad to see the cause of britain regaining its proper relationship with europe damaged by this split within ukip. robert kilroy-silk could have a lot to offer. instead we have a split party and a damaged cause. under the present electoral system people must work together and small parties have no hope of representation. last summer ukip achieved a major advance partly and only partly due to kilroy-silk. it is a great shame this has been dissipated in in-fighting. ukip has a wide platform of policies not just withdrawal from the eu. this kilroy-silk conveniently ignores in the comments surrounding the launch of his own party. neither the english democrats nor the new party were interested in letting him join them and take over their leadership speaks volumes. veritas is the beginning of the end for kilroy-silk. if he believes in truth and democracy then he and the two assembly members should resign and force a by-elections to stand on their own platform rather than this backdoor approach to politics of being elected for one party then defecting to another. so ukip was good enough for him to lead not good enough for him to follow! interesting that a party committed to plain speaking should have a latin name! every opinion poll points to an overwhelming anti-europe feeling in this country. kilroy-silk could be on the verge of something huge if he can broaden his appeal beyond this one issue. he is an extremely able communicator with years of political experience. we wants quality schools top hospitals clean and efficient public transport punishments that fit the crime limited asylum a purge on bureaucracy and less taxes. it needs courage and honesty two qualities sadly lacking in our politicians. kilroy-silk may just have those very qualities. recruit the right colleagues robert and your time may have come! well if you cannot get enough limelight being an ordinary mp then go out and start up your own party. it s all flash and no real policy here let s hope this is the start of both ukip and kilroy-silk slipping into obscurity. veritas the name will doom it. but perhaps i am wrong for surely all modern schoolchildren will understand it since they do still learn latin in the classroom do they not the whole essence of what rks represents is euroscepticism so explain to me how the too-twee label of veritas symbolises that'</li><li>'lib dems target first-time buyers the liberal democrats have unveiled plans to build 100 000 new affordable homes on publicly owned land. the party s scheme would allow people to buy a share in a home through a mutual home ownership trust as a way of getting onto the housing ladder. the lib dems would also encourage the conversion of existing buildings in an effort to protect greenfield sites. labour has already announced plans to help first-time buyers and the tories would extend right-to-buy schemes. all the major parties are focusing on the issue in the run-up to the election after a survey suggested first-time buyers could not afford a home in 92% of uk towns. the lib dems say their mutual homes would let people buy a share of a property usually worth about 5% of the building costs. party leader charles kennedy said the homes would be affordable because they would be built on surplus public sector land donated by central or local government. people would also only have to pay for the cost of the building and not the land he added. they would spend about 30% of their monthly salary on rent and buying extra shares in the property. when they moved house they would be able to cash in on any rise in property prices by selling their share. it would also allow councils to vary discounts to tenants given the right to buy their council homes so local needs were taken into account. mr kennedy said: mutual homes will offer people the opportunity to build up an equity stake in a home gradually investing only as much as they can afford. there are also plans to prevent high house prices forcing people out of their local communities. the kind of golden share used by the lib dems in south shropshire could be rolled out more widely. under the plan councils secure deals with developers where they keep a 1% share in a property scheme so properties cannot be sold on the open market. instead they are sold at build cost to people who the local council decides have local needs. the party says its help for first-time buyers can be funded at no extra cost to the taxpayer. but the plans involve changing the vat system which the party says often makes it too expensive to renovate existing buildings. the conservatives claimed the plans would amount to an extra tax of up to £11 000 on every new house. this is typical of lib dem hypocrisy said tory shadow local government secretary caroline spelman. they claim that they want to help people on to the property ladder but the small print of their policies reveal how they intend to price even more people out of the housing market. the flagship tory proposal on housing policy is to give a million more housing association tenants the right to buy their homes. labour has said it will allow 300 000 council and housing association tenants to buy a share in their homes. housing minister keith hill said much of the lib dem plans mimicked the government s strategy. however as usual the lib dems proposals are completely uncosted he said. mr hill said he also asked whether the lib dems would match labour s promise to spend £42bn on making refurbishing and repair council homes by 2010.'</li></ul> |
| 0 | <ul><li>'souped-up wi-fi is on the horizon super high-speed wireless data networks could soon be in use in the uk. the government s wireless watchdog is seeking help on the best way to regulate the technology behind such networks called ultra wideband (uwb). ofcom wants to ensure that the arrival of uwb-using devices does not cause problems for those that already use the same part of the radio spectrum. uwb makes it possible to stream huge amounts of data through the air over short distances. one of the more likely uses of uwb is to make it possible to send dvd quality video images wirelessly to tv screens or to let people beam music to media players around their home. the technology has the potential to transmit hundreds of megabits of data per second. uwb could also be used to create so-called personal area networks that let a person s gadgets quickly and easily swap data amongst themselves. the technology works over a range up to 10 metres and uses billions of short radio pulses every second to carry data. at the recent consumer electronics show in las vegas products with uwb chips built-in got their first public airing. currently use of uwb is only allowed in the uk under a strict licencing scheme. we re seeking opinion from industry to find out whether or not we should allow uwb on a licence-exempt basis said a spokesman for ofcom. companies have until 24 march to respond. in april the ec is due to start its own consultation on europe-wide adoption of uwb. the cross-europe body for radio regulators known as the european conference of postal and telecommunications administrations (cept) is carrying out research for this harmonisation programme. early sight of the cept work has caused controversy as some think it over-emphasises uwb s potential to interfere with existing users. by contrast a preliminary ofcom report found that it would be quite straight-forward to deploy uwb without causing problems for those that already use it. the ofcom spokesman said it was considering imposing a mask or set of technical restrictions on uwb-using devices. we would want these devices to have very strict controls on power levels so they can not transmit a long way or over a wide area he said. despite the current restrictions the technology is already being used. cambridge-based ubisense has about 40 customers around the world using the short-range radio technology said david theriault standards and regulatory liaison for ubisense. he said that uwb was driving novel ways to interact with computers. it s like having a 3d mouse all the time he said. he said that european decisions on what to do with uwb allied with ieee decisions on the exact specifications for it would help drive adoption. prior to its adoption as a way for gadgets and computers to communicate uwb was used as a sensing technology. it is used to spot such things as cracks under the surface of runways or to help firemen detect people through walls.'</li><li>'microsoft plans safer id system microsoft is planning to make windows and internet explorer more secure by including software to give people more control over personal information. info cards will help people manage personal details on their pcs to make online services safer said microsoft. microsoft s two previous programs passport and hailstorm aimed to protect users but were criticised. id fraud is one of the uk s fastest-growing crimes with criminals netting an estimated £1.3bn last year. a quarter of uk adults has either had their id stolen via hi-tech or other means or knows someone who has a recent report by which magazine found. microsoft is developing a new version of internet explorer browser and its operating system windows which has been code-named longhorn. michael stephenson director in microsoft s windows server division would not confirm however whether the new info cards id system will be built into the current windows xp version or longhorn. we re trying to make the end-user experience as simple as possible mr stephenson said. the system would differ from its previous attempts to make online transactions more secure said microsoft. while passport and hailstorm stored user information centrally on the net the latest system will store data on a user s pc. it s going to put control of digital ids into the hands of an end-user the end-user will be in full control said mr stephenson. hailstorm was criticised by privacy campaigners for putting too much sensitive information into the hands of a single company. passport provides a single log-in for more than one website and stores basic personal information. but its popularity suffered after security scares. up to 200 million passport accounts were left vulnerable to online theft and malicious hackers after a flaw in the system was exploited in 2003. online auction site ebay stopped supporting it in january 2005. although the flaw was fixed microsoft has come under regular criticism for the number of security loopholes in internet explorer. last year it released a major security update for windows service pack 2 to combat some of the security concerns. longhorn is due to be released commercially in late 2006 but an updated version of internet explorer is due for release later this year.'</li><li>'casual gaming to take off games aimed at casual players are set to be even bigger in 2005 according to industry experts. easy-to-play titles that do not require too much time and that are playable online or downloadable to mobile devices will see real growth in the coming year. the trend shows that gaming is not just about big-hitting games console titles which appeal more to hardcore gamers said a panel of experts. they were speaking before the annual consumer electronics show in las vegas which showcases the latest trends in gadgets and technologies for 2005. the panel also insisted that casual gamers were not just women a common misconception which pervades current thinking about gamer demographics. casual games like poker pool bridge bingo and puzzle-based titles which can be played online or downloaded onto mobile devices were gender neutral and different genres attracted different players. greg mills program director at aol said its figures suggested that sports-based games attracted 90% of 18 to 24-year-old males while puzzle games were played by 80% of females. games like bridge tended to attract the over-50 demographic of gamers. but hardcore gamers who are more attracted to blockbuster gamers which usually require hi-spec pcs like half-life 2 or halo 2 on xbox also liked to have a different type of gaming experience. when hardcore gamers are not playing halo they are playing poker and pool based on our research said geoff graber director of yahoo games which attracts about 12 million gamers a month. with the growth of powerful pc technology and ownership broadband take-up portable players and mobile devices as well as interactive tv casual gaming is shaping up to be big business in 2005 according to the panel. the focus for the coming year should be about attracting third-party developers into the field to offer more innovative and multiplayer titles they agreed. we are at a time where we are on the verge of something much bigger said mr graber. casual games will get into their stride in 2005 will be really big in 2006 and will be about community. with more people finding more to do with their gadgets and high-speed connections casual games would start to open up the world of gaming as a form of mass-market entertainment to more people. key to these types of titles is the chance they give people who may not see themselves as gamers to dip in and out of games when they liked. portal sites which offer casual games like aol yahoo and realarcade as well as other games-on-demand services allow people to build up buddy lists so they can return and play against the same people. this aspect of community is crucial for gamers who just want to have quick access to free or cheap games without committing long periods of time immersed in £30 to £40 console or pc titles said the panel. about 120 000 people are expected to attend the ces trade show which stretches over more than 1.5 million square feet and which officially runs from 6 to 9 january. the main theme is how new devices are getting better at talking to each other allowing people to enjoy digital content like audio video and images when they want and where they want.'</li></ul> |
| 2 | <ul><li>'woodward eyes brennan for lions toulouse s former irish international trevor brennan could be one of clive woodward s many surprises when the 44-man lions tour squad is announced. brennan who last played for ireland against samoa in 2001 is held in high esteem by the former england coach. if you speak to the players there s a huge amount of respect for the guy woodward told the sunday independent. players tend to know better than most coaches. it s not just the irish but welsh and english players as well. the 31-year-old former dublin milkman moved from leinster to toulouse in 2003 and immediately picked up a heineken cup winner s medal in an all-french final against perpignan at lansdowne road. brennan is highly-rated at stade toulousain where he is used anywhere in the back five. woodward is ensuring his preparations for the trip to new zealand in june are as thorough as possible. i ve spoken to quite a few players and they probably don t know what they re actually saying when we re having these conversations he told the newspaper. but you talk about certain players and they ll say if they think they re up to scratch or that they don t want them in their team. i haven t heard a bad word said against trevor which considering he has a pretty tough guy reputation is to me impressive.'</li><li>'off-colour gardener storms to win britain s jason gardener shook off an upset stomach to win the 60m at sunday s leipzig international meeting. gardener clocked 6.56 seconds to equal the meeting record and finished well ahead of germany s marc blume who crossed the line in 6.67 secs. the world indoor champion said: i got to the airport and my stomach was upset and i was vomiting. i almost went home. i felt a little better sunday morning but decided i d only run in the main race. then everything went perfectly. gardener part of the great britain 4x100m quartet that won gold at the athens olympics will now turn his attention to next weekend s norwich union european indoor trials in sheffield. given i am still off-colour i know there is plenty more in the tank and i expect to get faster in the next few weeks he said. it s just a case of chipping away as i have done in previous years and the results will come. scotland s ian mackie was also in action in leipzig. he stepped down from his favoured 400m to 200m to finish third in 21.72 secs. germany s alexander kosenkow won the race in 21.07 secs with dutchman patrick van balkom second in 21.58 secs. there were plenty of other senior british athletes showing their indoor form over the weekend. promising 60m hurdler clocked a new uk record of 7.98 seconds at a meeting in norway. the 24-year-old reached the mark in her heat but had to settle for joint first place with former aaa champion diane allahgreen in the final. who broke onto the international scene at the olympic games last season set an indoor personal best of 16.50m in the triple jump at a meeting in ghent. that leap - 37cm short of brazilian winner jadel gregorio s effort - was good enough to qualify for the european indoor championships. at the same meeting finished third in 7.27 seconds in a high-class women s 60m. the event was won by european medal favourite christine arron of france while belgium rival kim gevaert was second. britain s joice maduaka finished fifth in 7.35. olympic bronze heptathlon medallist made a low-key return to action at an indoor meeting in birmingham. the 28-year-old cleared 1.76m to win the high jump and threw 13.86m in the women s shot put.'</li><li>'parry relishes anfield challenge bbc sport reflects on the future for liverpool after our exclusive interview with chief executive rick parry. chief executive parry is the man at the helm as liverpool reach the most crucial point in their recent history. parry has to deliver a new 60 000-seat stadium in stanley park by 2007 amid claims of costs spiralling above £120m. he is also searching for an investment package of a size and stature that will restore liverpool to their place at european football s top table. but it is a challenge that appears to sit easily with parry who has forged a reputation as one of football s most respected administrators since his days at the fledgling premier league. liverpool have not won the championship since 1990 a fact that causes deep discomfort inside anfield as they attempt to muscle in on the top three of chelsea manchester united and arsenal. throw in the small matter of warding off every top club in world football as they eye captain steven gerrard and you can see parry is a man with a lot on his plate. but in the comfort of a conference room deep inside liverpool s heartbeat - the kop end - parry spoke to us with brutal honesty about the crucial months ahead. he only dodged one question - when asked to reveal the name of the mystery investor currently courting liverpool a polite smile deflected the inquiry. but to his credit he met everything else head on in measured tones that underscore the belief that liverpool still mean business. by business he means becoming title challengers again and locking the pieces together that will help return the trophy to liverpool is parry s mission. parry has already successfully put one of those planks in place in the form of new manager rafael benitez. and his enthusiasm for the spaniard s personality and methods is an indication of his clear feeling that he has struck gold. benitez s early work has given parry renewed optimism about the years ahead. but it remains a massive task at a club with a unique history and expectations. this will not come as news to parry a lifelong liverpool supporter but his quiet determination suggests he is no mood to be found wanting... captain gerrard is central to liverpool s plans and parry s insistence that all offers will be refused is a firm statement of intent. as ever the player will have the final say and parry acknowledges that but he is determined to provide the framework and environment for liverpool and gerrard to flourish. in terms of the search for new investment hawkpoint were appointed as advisors to flush out interest in march 2004. thailand prime minister thaksin shiniwatra came and went while the most serious statement of intent came from tycoon and lifelong fan steve morgan. morgan had a succession of bids rejected having come close in the summer only for talks to break down over potential costs for the new stadium. bbc sport understands morgan is still ready and willing to invest in liverpool and parry has kept the door ajar despite currently seeking investment elsewhere. morgan however has had no formal contact with liverpool or their advisors since last december blaming indecision at board level as he publicly withdrew his £70m offer. he was also convinced his interest was being used to lure in others so any new approach would now have to come from liverpool. morgan will certainly not be making another call. so speculation continues about the new benefactor with trails leading to the middle east and america but all met with an understandable veil of secrecy from anfield. parry meanwhile sees the new ground as crucial to liverpool s future but is refusing to become emotionally attached to the idea. he is determined the ground will only be built on an affordable basis and will not make future liverpool management hostages to the new stadium. parry will pull back the moment the figures do not stack up but there has been a vital new development in north london that has re-shaped liverpool s thinking. liverpool have publicly refused to entertain the idea of stadium sponsorship and potential naming rights - but the realism of arsenal s stunning £100m deal for their new emirates stadium at ashburton has changed the landscape. parry labelled the deal an eye-opener and admits liverpool would be missing a trick not to explore the possibilities. he knows some traditionalist liverpool fans will reel at any attempt to call the new stadium anything other than just anfield but the maths of modern-day football decree that multi-millions for stadium and team could ease the pain. i would take £50m if we had no investment but if we did keep him. as for the stadium if it gets us cash what difference does it make really £50m for gerrard i don t care who you are the directors would take the money and it is the way it should be. we cannot let that sum of money go despite gerrard s quality. through a cleverly worded statement the club has effectively forced gerrard to publicly make the decision for himself which i think is the right thing to do. critical time for liverpool with regards to gerrard. ideally we would want to secure his future to the club for the long term. i am hoping he doesn t walk out of the club like michael owen did for very little cash. £50m realistically would allow rafa to completely rebuild the squad however if we can afford to do this and keep gerrard we will be better for it. i would however be happy with gerrard s transfer for any fee over £35m. parry s statements are clever in that any future gerrard transfer cannot be construed as a lack of ambition by the club to not try and keep their best players. upping the ante is another smart move by parry. i would keep gerrard. no amount of money could replace his obvious love of the club and determination to succeed. the key is if gerrard comes out and says that he is happy. clearly if he isn t then we would be foolish not to sell. the worrying thing is who would you buy (or who would come) pending possible non-champions league football.'</li></ul> |
| 3 | <ul><li>'rem announce new glasgow concert us band rem have announced plans to perform for 10 000 scottish fans in a rescheduled gig. the band will play in what has been dubbed europe s biggest tent on glasgow green on tuesday 14 june. they were forced to pull out of a concert at the secc in glasgow last month after bassist mike mills contracted flu. fans who bought tickets for the original 22 february show can attend the rescheduled concert. the june gig will act as a warm-up for rem s open air concert at balloch castle country park on the banks of loch lomond four days later. promoters regular music booked glasgow green as the secc was not available on the most suitable date. mark mackie director of regular music said: it is fantastic news and it really shows rem s commitment to their scottish fans that they are coming back to glasgow for what will be a truly unique gig. the rem gigs will kick-start what promises to be a memorable summer for scottish music lovers. grammy award winners u2 will play hampden on 21 june while oasis will also perform at the national stadium in glasgow on 29 june. coldplay have announced a concert at bellahouston park in glasgow on 1 july and t in the park will be held at balado near kinross from 9-10 july. ticketweb and the secc box office will write to customers who bought tickets for the february gig asking if they want to attend the new show. those who bought tickets in person are being urged to return to the point of purchase. anyone who cannot make the concert will be given a refund. the cut-off date for swapping tickets is 1 april when those remaining will go on sale to the public.'</li><li>'tautou film tops cesar prize nods french film a very long engagement has received 12 nominations for france s cesar film awards despite a recent ruling it was not french enough . the world war i romantic drama starring audrey tautou was recently ruled too american by a paris court as it was partially backed by warner bros. but the cesar organisers modified their rules to allow the film to compete. the film directed by jean-pierre jeunet received best actress picture and director nominations. last november a court judged the film was too american to compete in french film festivals. two associations of french producers challenged jeunet s right to french government subsidies because warner bros was a backer. the ruling meant the movie - which was filmed in france and used french actors and technicians - was not eligible to compete for french prizes. but alain terzian president of cesar organisers the academie des arts et techniques du cinema said the changes in eligibility rules which allow films of french expression were made three months prior to the court decision. other films in the best film category include police drama 36 quai des orfevres arnaud desplechin s kings and queen abdellatif kechiche s l esquive and france s number one film at the 2004 box-office the chorus. best actors are daniel auteuil for 36 mathieu amalric for kings and queen gerard jugnot for the chorus philippe torreton for l equipier and benoit poelvoorde for podium. tautou will compete against maggie cheung emmanuelle devos yolande moreau and karin viard for best actress. michael moore s fahrenheit 9/11 the motorcycle diaries lost in translation eternal sunshine of the spotless mind and 21 grams are all vying in the best foreign film prize. the awards ceremony will be held on 26 february. this year will smith star of i robot independence day and men in black will be given an honorary cesar along with french singer/actor jacques dutronc.'</li><li>'gallery unveils interactive tree a christmas tree that can receive text messages has been unveiled at london s tate britain art gallery. the spruce has an antenna which can receive bluetooth texts sent by visitors to the tate. the messages will be unwrapped by sculptor richard wentworth who is responsible for decorating the tree with broken plates and light bulbs. it is the 17th year that the gallery has invited an artist to dress their christmas tree. artists who have decorated the tate tree in previous years include tracey emin in 2002. the plain green norway spruce is displayed in the gallery s foyer. its light bulb adornments are dimmed ordinary domestic ones joined together with string. the plates decorating the branches will be auctioned off for the children s charity artworks. wentworth worked as an assistant to sculptor henry moore in the late 1960s. his reputation as a sculptor grew in the 1980s while he has been one of the most influential teachers during the last two decades. wentworth is also known for his photography of mundane everyday subjects such as a cigarette packet jammed under the wonky leg of a table.'</li></ul> |
| 1 | <ul><li>'ask jeeves tips online ad revival ask jeeves has become the third leading online search firm this week to thank a revival in internet advertising for improving fortunes. the firm s revenue nearly tripled in the fourth quarter of 2004 exceeding $86m (£46m). ask jeeves once among the best-known names on the web is now a relatively modest player. its $17m profit for the quarter was dwarfed by the $204m announced by rival google earlier in the week. during the same quarter yahoo earned $187m again tipping a resurgence in online advertising. the trend has taken hold relatively quickly. late last year marketing company doubleclick one of the leading providers of online advertising warned that some or all of its business would have to be put up for sale. but on thursday it announced that a sharp turnaround had brought about an unexpected increase in profits. neither ask jeeves nor doubleclick thrilled investors with their profit news however. in both cases their shares fell by some 4%. analysts attributed the falls to excessive expectations in some quarters fuelled by the dramatic outperformance of google on tuesday.'</li><li>'us bank boss hails genius smith us federal reserve chairman alan greenspan has given a speech at a scottish church in honour of the pioneering economist adam smith. he delivered the 14th adam smith lecture in kirkcaldy fife. the adam smith lecture celebrates the author of 1776 s wealth of nations which became a bible of capitalism. dr greenspan was invited by chancellor gordon brown whose minister father john used to preach at the st bryce kirk church. mr brown introduced dr greenspan to the 400 invited guests as the the world s greatest economist . dr greenspan 79 who has been in the uk to attend the g7 meeting in london said the world could never repay the debt of gratitude it owed to smith whose genius he compared to that of mozart. he said the philosopher was a towering contributor to the modern world . kirkcaldy the birthplace in 1723 of adam smith and by extension of modern economics is also of course where your chancellor was reared. i am led to ponder to what extent the chancellor s renowned economic and financial skills are the result of exposure to the subliminal intellect-enhancing emanation in this area. he continued: smith reached far beyond the insights of his predecessors to frame a global view of how market economics just then emerging worked. in so doing he supported changes in societal organisation that were to measurably enhance standards of living. dr greenspan said smith s revolutionary philosophy on human self-interest laissez-faire economics and competition had been a force for good in the world. the incredible insights of a handful of intellectuals of the enlightenment - especially with smith toiling in the environs of kirkcaldy - created the modern vision of people free to choose and to act according to their individual self-interest he said. following his lecture dr greenspan - who received an honorary knighthood from the queen at balmoral in 2002 - was awarded an honorary fellowship of the royal society of edinburgh. he later opened an exhibition dedicated to smith in the atrium of fife college of further and higher education. joyce johnston principal of the college said: it is very fitting that the world s premier economist delivered this lecture in tribute to the world s first economist. dr greenspan - who became chairman of the federal reserve for an unprecedented fifth term in june 2004 - will step down in january next year. he has served under presidents george w bush bill clinton george bush and ronald reagan. he was also chairman of the council of economic advisors to gerald ford.'</li><li>'hariri killing hits beirut shares shares in solidere the lebanese company founded by assassinated former prime minister rafik hariri fell 15% in renewed trading in beirut. the real estate firm which dominates lebanon s stock exchange ended the day down at $8.08. traders said there was some panic selling during friday s session the first since a three-day market closure to mourn the death of mr hariri. beirut s benchmark blom stock index closed down 7.9% at 642.80. solidere in which mr hariri was a major shareholder was the major drag on the index. the company owns much of the property in central beirut which it restored and redeveloped following the end of lebanon s bitter 15-year civil war. solidere should be above $10 but because of this disaster it is falling said one trader. if solidere drops much lower i would consider it a buying opportunity. this is a very big company held by many lebanese. critics had accused mr hariri of using lebanon s post-war reconstruction drive for his personal financial gain. but his assassination on monday sent shudders through lebanon s business community which saw the billionaire tycoon as the country s best hope for economic revival. solidere posted profits of $12.5m in the first half of 2004 and its shares had been gaining in recent months.'</li></ul> |
## Evaluation
### Metrics
| Label | Accuracy |
|:--------|:---------|
| **all** | 0.953 |
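
The reported accuracy comes from the card's automated evaluation. A minimal sketch of how a comparable score could be computed with the `setfit` API is shown below; the test split here is a hypothetical placeholder, not the card's actual evaluation data:

```python
from datasets import Dataset
from setfit import SetFitModel

# Load the fine-tuned classifier from the Hub
model = SetFitModel.from_pretrained("vidhi0206/setfit-paraphrase-mpnet-base-v2")

# Hypothetical labeled test split; stand-in for the real evaluation data
test = Dataset.from_dict({
    "text": ["ask jeeves tips online ad revival ...", "desailly backs blues revenge trip ..."],
    "label": [0, 4],
})

# Predict labels and compute plain accuracy
preds = model.predict(test["text"])
accuracy = sum(int(p) == int(l) for p, l in zip(preds, test["label"])) / len(preds)
print(f"accuracy = {accuracy:.3f}")
```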
## Uses
### Direct Use for Inference
First install the SetFit library:
```bash
pip install setfit
```
Then you can load this model and run inference.
```python
from setfit import SetFitModel
# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("vidhi0206/setfit-paraphrase-mpnet-base-v2")
# Run inference
preds = model("versace art portfolio up for sale the art collection of murdered fashion designer gianni versace could fetch up to £9m ($17m) when it is auctioned in new york and london later this year. among the pictures for sale are works by roy lichtenstein andy warhol and henri matisse. the collection was housed at versace s six-storey new york townhouse. the 51-year-old designer was shot outside his florida home in 1997 by suspected serial killer andrew cunanan who later killed himself. the auction at sotheby s will feature 45 contemporary impressionist and 19th century paintings. one of the highlights of the sale is roy lichtenstein s blue nude which has been given an estimate of £1.8m ($3.4m). tobias meyer sotheby s worldwide head of contemporary art said: this collection reflects mr versace s wide-ranging taste and impeccable eye and many of the works were commissioned directly from the artists. outstanding later examples from champions of the pop movement such as roy lichtenstein are juxtaposed with masterpieces from the most visible artists of the 1980 s including jean-michel basquiat and the collaborative genius of basquiat and warhol as well as francesco clemente. much of the collection will be offered for sale at three auctions in new york in june with smaller contemporary paintings going under the hammer in london on 22 and 23 june. a sale of versace s furniture and artworks sold in 2001fetched £5.5m ($10.3m).")
```
<!--
### Downstream Use
*List how someone could finetune this model on their own dataset.*
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Set Metrics
| Training set | Min | Median | Max |
|:-------------|:----|:--------|:-----|
| Word count | 173 | 419.325 | 1121 |
| Label | Training Sample Count |
|:------|:----------------------|
| 0 | 8 |
| 1 | 8 |
| 2 | 8 |
| 3 | 8 |
| 4 | 8 |
### Training Hyperparameters
- batch_size: (8, 8)
- num_epochs: (1, 1)
- max_steps: -1
- sampling_strategy: oversampling
- num_iterations: 20
- body_learning_rate: (2e-05, 2e-05)
- head_learning_rate: 2e-05
- loss: CosineSimilarityLoss
- distance_metric: cosine_distance
- margin: 0.25
- end_to_end: False
- use_amp: False
- warmup_proportion: 0.1
- seed: 42
- eval_max_steps: -1
- load_best_model_at_end: False
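
These values map directly onto `setfit.TrainingArguments`. A minimal sketch of how a run with this configuration might be launched, assuming the setfit 1.0 `Trainer` API (the 8-shot training set below is a placeholder):

```python
from datasets import Dataset
from sentence_transformers.losses import CosineSimilarityLoss
from setfit import SetFitModel, Trainer, TrainingArguments

model = SetFitModel.from_pretrained("sentence-transformers/paraphrase-mpnet-base-v2")

# Placeholder few-shot dataset: 8 labeled examples per class, as in the table above
train_dataset = Dataset.from_dict({
    "text": [f"news article {i} ..." for i in range(40)],
    "label": [i % 5 for i in range(40)],
})

args = TrainingArguments(
    batch_size=(8, 8),                  # (embedding phase, classifier phase)
    num_epochs=(1, 1),
    num_iterations=20,                  # contrastive pairs generated per sample
    sampling_strategy="oversampling",
    body_learning_rate=(2e-05, 2e-05),
    head_learning_rate=2e-05,
    loss=CosineSimilarityLoss,
    warmup_proportion=0.1,
    seed=42,
)

trainer = Trainer(model=model, args=args, train_dataset=train_dataset)
trainer.train()
```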
### Training Results
| Epoch | Step | Training Loss | Validation Loss |
|:-----:|:----:|:-------------:|:---------------:|
| 0.005 | 1 | 0.245 | - |
| 0.25 | 50 | 0.0174 | - |
| 0.5 | 100 | 0.0008 | - |
| 0.75 | 150 | 0.0005 | - |
| 1.0 | 200 | 0.0002 | - |
### Framework Versions
- Python: 3.8.10
- SetFit: 1.0.3
- Sentence Transformers: 2.3.1
- Transformers: 4.37.2
- PyTorch: 2.2.0+cu121
- Datasets: 2.17.0
- Tokenizers: 0.15.1
## Citation
### BibTeX
```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
doi = {10.48550/ARXIV.2209.11055},
url = {https://arxiv.org/abs/2209.11055},
author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Efficient Few-Shot Learning Without Prompts},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
--> | {"base_model": "sentence-transformers/paraphrase-mpnet-base-v2", "library_name": "setfit", "metrics": ["accuracy"], "pipeline_tag": "text-classification", "tags": ["setfit", "sentence-transformers", "text-classification", "generated_from_setfit_trainer"], "widget": [{"text": "versace art portfolio up for sale the art collection of murdered fashion designer gianni versace could fetch up to £9m ($17m) when it is auctioned in new york and london later this year. among the pictures for sale are works by roy lichtenstein andy warhol and henri matisse. the collection was housed at versace s six-storey new york townhouse. the 51-year-old designer was shot outside his florida home in 1997 by suspected serial killer andrew cunanan who later killed himself. the auction at sotheby s will feature 45 contemporary impressionist and 19th century paintings. one of the highlights of the sale is roy lichtenstein s blue nude which has been given an estimate of £1.8m ($3.4m). tobias meyer sotheby s worldwide head of contemporary art said: this collection reflects mr versace s wide-ranging taste and impeccable eye and many of the works were commissioned directly from the artists. outstanding later examples from champions of the pop movement such as roy lichtenstein are juxtaposed with masterpieces from the most visible artists of the 1980 s including jean-michel basquiat and the collaborative genius of basquiat and warhol as well as francesco clemente. much of the collection will be offered for sale at three auctions in new york in june with smaller contemporary paintings going under the hammer in london on 22 and 23 june. a sale of versace s furniture and artworks sold in 2001fetched £5.5m ($10.3m)."}, {"text": "councils prepare to set tax rises council tax in scotland is set to rise by an average of about 4% in the coming year bbc scotland has learned. authorities will decide final figures on thursday when projected increases will be more than twice the rate of inflation which is currently 1.6%. the finance minister has urged councils to limit increases but they have warned that they will struggle to maintain services unless funding is increased. they say much additional government money is for new initiatives. scottish finance minister tom mccabe msp said: last week in parliament i announced an additional £419m for core expenditure to local government in scotland. that s a 5.5% increase and sits against an inflation rate of 1.6% so i think we have quite rightly said to councils this year that we would at the very least ask them to exercise restraint. mr mccabe is also looking for local authorities to become more efficient and save money in coming years. he told bbc radio scotland s sunday live programme: here in scotland we have 32 councils who all have their own individual collection systems for council tax they have their own payroll systems and their own human resource systems. we think there has to be opportunities there for rationalisation and using the money saved to reinvest in frontline services. the councils umbrella organisation cosla which provided bbc scotland with the indicative figures for next year warned that councils would face a continuous struggle to maintain services. mr mccabe has promised them about £8.1bn next year. however most of the increase is targeted to new initiatives and councils will experience difficulties in maintaining core services a cosla spokesman said. 
cosla says that it is willing to work with the executive on finding efficiency savings but that these will not be enough to maintain services. they say the funding plans for the next three years will see councils lose more of the share of public spending. the conservatives accuse the scottish executive of using the council tax to raise funds because it is too afraid to raise income tax. the tory finance spokesman brian monteith msp said: its a form of disguise... yet again we see that council tax is being used as a way of passing on costs. scared of actually using its three pence income tax that it could put up what we ve seen over the years is more and more burdens being put onto local authorities and the council tax payer having to pick up the bill. there are also warnings that unless funding to councils is increased in the next few years then services may have to be reduced. linda knox director of the scottish local authority management centre at strathclyde university said: with this current settlement the increase is slowing. at the same time the burdens on councils are greater than they were. the settlement figures don t include pay increases and the executive is also requiring a substantial figure - in the area of £325m - in efficiency savings across the settlement period. education will be protected from any cuts but linda knox says this will mean other services will suffer. she said: in practice that will mean a 4-5% cut for other services. on the face of it the settlement looks like an increase of about 9.7% but by the time you take into account other factors its probably only about 1% in real terms."}, {"text": "gadget show heralds mp3 christmas partners of those who love their hi-tech gear may want to get their presents in early as experts predict a gadget shortage this christmas. with apple s ipod topping wish lists again there may not be enough ipod minis to go round predicts oliver irish editor of gadget magazine stuff. the ipod mini is likely to be this year s tracey island said mr irish. stuff has compiled a list of the top 10 gadgets for 2004 and the ipod is at number one. for anyone bewildered by the choice of gadgets on the market stuff and what hi-fi are hosting a best-of gadget show in london this weekend. star of the show will be sony s qrio robot an all-singing all-dancing football-playing man-machine who can even hold intelligent conversations. but he is not for sale and sony has no commercial plans for the robot. he will greet visitors and is flying in from japan. he probably has his own airplane seat that is how highly sony prize him said mr irish. also on display will be a virtual keyboard which projects itself onto any flat surface. the event will play host to a large collection of digital music players from companies such as creative sony and philips as well as the ubiquitously fashionable ipod from apple. suggestions that it could be a gaming or wireless christmas are unlikely to come true as mp3 players remain the most popular stocking filler said mr irish. demand is huge and apple has promised that it can supply enough but people might struggle to get their hands on ipod minis said mr irish. for those who like their gadgets to be multi-talented the gizmondo a powerful gaming console with gps and gprs that also doubles up as an mp3 player movie player and camera could be a must-have. what is impressive is how much it can do and how well it can do them said mr irish. this christmas gadgets will not be an all-male preserve. 
women will be getting gadgets from husbands and boyfriends as well as buying them for themselves said mr irish. gadgets nowadays are lifestyle products rather than just for geeks."}, {"text": "virus poses as christmas e-mail security firms are warning about a windows virus disguising itself as an electronic christmas card. the zafi.d virus translates the christmas greeting on its subject line into the language of the person receiving infected e-mail. anti-virus firms speculate that this multilingual ability is helping the malicious program spread widely online. anti-virus firm sophos said that 10% of the e-mail currently on the net was infected with the zafi virus. like many other windows viruses zafi-d plunders microsoft outlook for e-mail addresses and then uses mail-sending software to despatch itself across the web to new victims. to be infected users must open up the attachment travelling with the message which bears the code for the malicious bug. the attachment on the e-mail poses as an electronic christmas card but anyone opening it will simply get a crude image of two smiley faces. the virus subject line says merry christmas and translates this into one of 15 languages depending of the final suffix of the e-mail address the infected message has been sent to. the message in the body of the e-mail reads: happy holidays and this too is translated. on infected machines the virus tries to disable anti-virus and firewall software and opens up a backdoor on the pc to hand over control to the writer of the virus. the virus is thought to have spread most widely in south america italy spain bulgaria and hungary. the original zafi virus appeared in april this year. we have seen these hoaxes for several christmases already and personally i prefer traditional pen and paper cards and we recommend this to all our clients too said mikko hypponen who heads f-secure s anti-virus team."}, {"text": "desailly backs blues revenge trip marcel desailly insists there is no chance of history repeating itself when chelsea take on barcelona on wednesday. the french star was part of the chelsea side crushed 5-1 at the nou camp in the champions league quarter-final second leg in 2000. things will be totally different this time he told bbc sport. now everyone knows about chelsea and is a little bit afraid of them. they are one of the major clubs in europe and the pressure will be on barcelona. chelsea have not played barcelona since that quarter-final tie five years ago. the blues had looked destined to progress after winning the first leg at stamford bridge 3-1 courtesy of two goals from tore andre flo and one by gianfranco zola. but they collapsed in the second leg going down to strikes from rivaldo (2) luis figo dani and patrick kluivert. former chelsea captain desailly who is now playing for al-gharafa in qatar says there is no comparison between that side and the current blues team who are top of the premiership. mentally they are much stronger even though a lot of their players are young the 36-year-old said. we made some mistakes at the nou camp in 2000 - a lot of them were individual mistakes. it would not happen now. this team has a new motivation and a different mentality. world cup winner desailly saw huge changes during his time at stamford bridge. he was signed for £4.6m from ac milan in 1998 by ruud gullit and went on to play under gianluca vialli and claudio ranieri. but the biggest change occurred when billionaire roman abramovich bought the club in 2003. 
desailly says the russian s arrival helped to instil a winning mentality at the club as well as a demand for success. the whole of chelsea is different now - the chairman the manager and all the players he said. everything is new and there is a huge determination to win. since that game in 2000 chelsea have gained more experience in europe and were very close to reaching the champions league final last season. desailly is one of the most decorated players in the history of football. he won the 1998 world cup and 2000 european championship with france the champions league in 1993 with marseilles and 1994 with ac milan two serie a titles and the fa cup in 2000 with chelsea. he is now winding down his career in qatar alongside the likes of frank lebeouf josep guardiola titi camara gabriel batistuta and christophe dugarry. so he is full of admiration for two of his colleagues from the great milan side of the mid-90s who are likely to line up against manchester united on wednesday - paolo maldini and alessandro costacurta. i m happy that they have managed to play so long at a high level he said. i made a vow to costacurta that as long as he plays i will continue to play. and it s amazing that paolo has managed to play at such a high level for such a long time."}], "inference": true, "model-index": [{"name": "SetFit with sentence-transformers/paraphrase-mpnet-base-v2", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "Unknown", "type": "unknown", "split": "test"}, "metrics": [{"type": "accuracy", "value": 0.953, "name": "Accuracy"}]}]}]} |
NickyNicky/StaticEmbedding-MatryoshkaLoss-gemma-2-2b-en-es | NickyNicky | sentence-similarity | [
"sentence-transformers",
"safetensors",
"sentence-similarity",
"feature-extraction",
"generated_from_trainer",
"dataset_size:4322286",
"loss:MatryoshkaLoss",
"loss:MultipleNegativesRankingLoss",
"arxiv:1908.10084",
"arxiv:2205.13147",
"arxiv:1705.00652",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
]
| 2025-01-16T04:07:42 | 2025-01-22T03:44:46 | 0 | 2 | ---
library_name: sentence-transformers
license: apache-2.0
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:4322286
- loss:MatryoshkaLoss
- loss:MultipleNegativesRankingLoss
widget:
- source_sentence: how to sign legal documents as power of attorney?
sentences:
- 'After the principal''s name, write “by” and then sign your own name. Under or
after the signature line, indicate your status as POA by including any of the
following identifiers: as POA, as Agent, as Attorney in Fact or as Power of Attorney.'
- '[''From the Home screen, swipe left to Apps.'', ''Tap Transfer my Data.'', ''Tap
Menu (...).'', ''Tap Export to SD card.'']'
- Ginger Dank Nugs (Grape) - 350mg. Feast your eyes on these unique and striking
gourmet chocolates; Coco Nugs created by Ginger Dank. Crafted to resemble perfect
nugs of cannabis, each of the 10 buds contains 35mg of THC. ... This is a perfect
product for both cannabis and chocolate lovers, who appreciate a little twist.
- source_sentence: how to delete vdom in fortigate?
sentences:
- Go to System -> VDOM -> VDOM2 and select 'Delete'. This VDOM is now successfully
removed from the configuration.
- 'Both combination birth control pills and progestin-only pills may cause headaches
as a side effect. Additional side effects of birth control pills may include:
breast tenderness. nausea.'
- White cheese tends to show imperfections more readily and as consumers got more
used to yellow-orange cheese, it became an expected option. Today, many cheddars
are yellow. While most cheesemakers use annatto, some use an artificial coloring
agent instead, according to Sachs.
- source_sentence: where are earthquakes most likely to occur on earth?
sentences:
- Zelle in the Bank of the America app is a fast, safe, and easy way to send and
receive money with family and friends who have a bank account in the U.S., all
with no fees. Money moves in minutes directly between accounts that are already
enrolled with Zelle.
- It takes about 3 days for a spacecraft to reach the Moon. During that time a spacecraft
travels at least 240,000 miles (386,400 kilometers) which is the distance between
Earth and the Moon.
- Most earthquakes occur along the edge of the oceanic and continental plates. The
earth's crust (the outer layer of the planet) is made up of several pieces, called
plates. The plates under the oceans are called oceanic plates and the rest are
continental plates.
- source_sentence: fix iphone is disabled connect to itunes without itunes?
sentences:
- To fix a disabled iPhone or iPad without iTunes, you have to erase your device.
Click on the "Erase iPhone" option and confirm your selection. Wait for a while
as the "Find My iPhone" feature will remotely erase your iOS device. Needless
to say, it will also disable its lock.
- How Māui brought fire to the world. One evening, after eating a hearty meal, Māui
lay beside his fire staring into the flames. ... In the middle of the night, while
everyone was sleeping, Māui went from village to village and extinguished all
the fires until not a single fire burned in the world.
- Angry Orchard makes a variety of year-round craft cider styles, including Angry
Orchard Crisp Apple, a fruit-forward hard cider that balances the sweetness of
culinary apples with dryness and bright acidity of bittersweet apples for a complex,
refreshing taste.
- source_sentence: how to reverse a video on tiktok that's not yours?
sentences:
- '[''Tap "Effects" at the bottom of your screen — it\''s an icon that looks like
a clock. Open the Effects menu. ... '', ''At the end of the new list that appears,
tap "Time." Select "Time" at the end. ... '', ''Select "Reverse" — you\''ll then
see a preview of your new, reversed video appear on the screen.'']'
- Franchise Facts Poke Bar has a franchise fee of up to $30,000, with a total initial
investment range of $157,800 to $438,000. The initial cost of a franchise includes
several fees -- Unlock this franchise to better understand the costs such as training
and territory fees.
- Relative age is the age of a rock layer (or the fossils it contains) compared
to other layers. It can be determined by looking at the position of rock layers.
Absolute age is the numeric age of a layer of rocks or fossils. Absolute age can
be determined by using radiometric dating.
---
<!--
### Nicko's fine-tuning test colab.
https://colab.research.google.com/drive/1IbcgP-KT01-5csBBB-SJ6kMiI1Udbokt#scrollTo=XgNQ1C1wWbTg&uniqifier=1
-->
# SentenceTransformer
This is a [sentence-transformers](https://www.SBERT.net) model. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
<!-- - **Base model:** [Unknown](https://huggingface.co/unknown) -->
- **Maximum Sequence Length:** inf tokens
- **Output Dimensionality:** 1024 dimensions
- **Similarity Function:** Cosine Similarity
<!-- - **Training Dataset:** Unknown -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): StaticEmbedding(
(embedding): EmbeddingBag(256000, 1024, mode='mean')
)
)
```
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("NickyNicky/StaticEmbedding-MatryoshkaLoss-gemma-2-2b-en-es")
# Run inference
sentences = [
"how to reverse a video on tiktok that's not yours?",
'[\'Tap "Effects" at the bottom of your screen — it\\\'s an icon that looks like a clock. Open the Effects menu. ... \', \'At the end of the new list that appears, tap "Time." Select "Time" at the end. ... \', \'Select "Reverse" — you\\\'ll then see a preview of your new, reversed video appear on the screen.\']',
'Relative age is the age of a rock layer (or the fossils it contains) compared to other layers. It can be determined by looking at the position of rock layers. Absolute age is the numeric age of a layer of rocks or fossils. Absolute age can be determined by using radiometric dating.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1024]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
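
Because the model was trained with MatryoshkaLoss (see Training Details below), embeddings can be truncated to any of the trained dimensions with modest quality loss. A small sketch, assuming the `truncate_dim` argument available in recent sentence-transformers releases:

```python
from sentence_transformers import SentenceTransformer

# Load the model with a smaller output dimensionality; 256 is one of the
# Matryoshka dimensions listed in the loss configuration below.
model = SentenceTransformer(
    "NickyNicky/StaticEmbedding-MatryoshkaLoss-gemma-2-2b-en-es",
    truncate_dim=256,
)

embeddings = model.encode(["¿cómo funciona la búsqueda semántica?"])
print(embeddings.shape)
# (1, 256)
```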
<!--
### Direct Usage (Transformers)
<details><summary>Click to see the direct usage in Transformers</summary>
</details>
-->
<!--
### Downstream Usage (Sentence Transformers)
You can finetune this model on your own dataset.
<details><summary>Click to expand</summary>
</details>
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Dataset
#### Unnamed Dataset
* Size: 4,322,286 training samples in English and Spanish (news, QA, summarization, and cryptocurrency-news data).
* Columns: <code>question</code> and <code>answer</code>
* Approximate statistics based on the first 1000 samples:
| | question | answer |
|:--------|:-----------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 18 characters</li><li>mean: 43.23 characters</li><li>max: 96 characters</li></ul> | <ul><li>min: 55 characters</li><li>mean: 253.36 characters</li><li>max: 371 characters</li></ul> |
* Samples:
| question | answer |
|:-----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>what is the difference between broilers and layers?</code> | <code>An egg laying poultry is called egger or layer whereas broilers are reared for obtaining meat. So a layer should be able to produce more number of large sized eggs, without growing too much. On the other hand, a broiler should yield more meat and hence should be able to grow well.</code> |
| <code>what is the difference between chronological order and spatial order?</code> | <code>As a writer, you should always remember that unlike chronological order and the other organizational methods for data, spatial order does not take into account the time. Spatial order is primarily focused on the location. All it does is take into account the location of objects and not the time.</code> |
| <code>is kamagra same as viagra?</code> | <code>Kamagra is thought to contain the same active ingredient as Viagra, sildenafil citrate. In theory, it should work in much the same way as Viagra, taking about 45 minutes to take effect, and lasting for around 4-6 hours. However, this will vary from person to person.</code> |
* Loss: [<code>MatryoshkaLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters:
```json
{
"loss": "MultipleNegativesRankingLoss",
"matryoshka_dims": [
1024,
768,
512,
256,
128,
64,
32
],
"matryoshka_weights": [
1,
1,
1,
1,
1,
1,
1
],
"n_dims_per_step": -1
}
```
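
In code, this configuration corresponds to wrapping `MultipleNegativesRankingLoss` in `MatryoshkaLoss` so the ranking objective is applied at every truncated embedding size. A minimal sketch with the sentence-transformers loss API:

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss

model = SentenceTransformer("NickyNicky/StaticEmbedding-MatryoshkaLoss-gemma-2-2b-en-es")

# Apply the ranking loss at each Matryoshka dimension with equal weight
base_loss = MultipleNegativesRankingLoss(model)
loss = MatryoshkaLoss(
    model,
    base_loss,
    matryoshka_dims=[1024, 768, 512, 256, 128, 64, 32],
    matryoshka_weights=[1, 1, 1, 1, 1, 1, 1],
)
```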
### Evaluation Dataset
#### Unnamed Dataset
* Size: 10,005 evaluation samples
* Columns: <code>question</code> and <code>answer</code>
* Approximate statistics based on the first 1000 samples:
| | question | answer |
|:--------|:-----------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 18 characters</li><li>mean: 43.17 characters</li><li>max: 98 characters</li></ul> | <ul><li>min: 51 characters</li><li>mean: 254.12 characters</li><li>max: 360 characters</li></ul> |
* Samples:
| question | answer |
|:-----------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>how do i program my directv remote with my tv?</code> | <code>['Press MENU on your remote.', 'Select Settings & Help > Settings > Remote Control > Program Remote.', 'Choose the device (TV, audio, DVD) you wish to program. ... ', 'Follow the on-screen prompts to complete programming.']</code> |
| <code>are rodrigues fruit bats nocturnal?</code> | <code>Before its numbers were threatened by habitat destruction, storms, and hunting, some of those groups could number 500 or more members. Sunrise, sunset. Rodrigues fruit bats are most active at dawn, at dusk, and at night.</code> |
| <code>why does your heart rate increase during exercise bbc bitesize?</code> | <code>During exercise there is an increase in physical activity and muscle cells respire more than they do when the body is at rest. The heart rate increases during exercise. The rate and depth of breathing increases - this makes sure that more oxygen is absorbed into the blood, and more carbon dioxide is removed from it.</code> |
* Loss: [<code>MatryoshkaLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters:
```json
{
"loss": "MultipleNegativesRankingLoss",
"matryoshka_dims": [
1024,
768,
512,
256,
128,
64,
32
],
"matryoshka_weights": [
1,
1,
1,
1,
1,
1,
1
],
"n_dims_per_step": -1
}
```
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: steps
- `per_device_train_batch_size`: 2048
- `per_device_eval_batch_size`: 2048
- `learning_rate`: 0.2
- `warmup_ratio`: 0.1
- `bf16`: True
- `batch_sampler`: no_duplicates
#### All Hyperparameters
<details><summary>Click to expand</summary>
- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 2048
- `per_device_eval_batch_size`: 2048
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 0.2
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 3
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: True
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: None
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `include_for_metrics`: []
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `average_tokens_across_devices`: False
- `prompts`: None
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional
</details>
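
Put together, a training run with these settings would look roughly like the sketch below; the datasets and output path are placeholders, and the loss is built as in the earlier MatryoshkaLoss sketch:

```python
from datasets import Dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss
from sentence_transformers.training_args import (
    BatchSamplers,
    SentenceTransformerTrainingArguments,
)

model = SentenceTransformer("NickyNicky/StaticEmbedding-MatryoshkaLoss-gemma-2-2b-en-es")

# Placeholder (question, answer) pairs standing in for the 4.3M-sample corpus
pairs = {
    "question": ["what is the difference between broilers and layers?"],
    "answer": ["An egg laying poultry is called egger or layer ..."],
}
train_ds = Dataset.from_dict(pairs)
eval_ds = Dataset.from_dict(pairs)

loss = MatryoshkaLoss(
    model,
    MultipleNegativesRankingLoss(model),
    matryoshka_dims=[1024, 768, 512, 256, 128, 64, 32],
)

args = SentenceTransformerTrainingArguments(
    output_dir="checkpoints",
    num_train_epochs=3,
    per_device_train_batch_size=2048,
    per_device_eval_batch_size=2048,
    learning_rate=0.2,
    warmup_ratio=0.1,
    bf16=True,
    eval_strategy="steps",
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_ds,
    eval_dataset=eval_ds,
    loss=loss,
)
trainer.train()
```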
### Training Logs
| Epoch | Step | Training Loss | Validation Loss |
|:------:|:----:|:-------------:|:---------------:|
| 0.0005 | 1 | 49.8746 | - |
| 0.0474 | 100 | 35.8567 | 7.1776 |
| 0.0947 | 200 | 13.988 | 3.2848 |
| 0.1421 | 300 | 8.0009 | 2.3610 |
| 0.1895 | 400 | 6.3293 | 2.0293 |
| 0.2369 | 500 | 5.6296 | 1.8849 |
| 0.2842 | 600 | 5.238 | 1.7495 |
| 0.3316 | 700 | 4.9115 | 1.6694 |
| 0.3790 | 800 | 4.5779 | 1.5583 |
| 0.4263 | 900 | 4.2608 | 1.4784 |
| 0.4737 | 1000 | 4.0893 | 1.4020 |
| 0.5211 | 1100 | 3.8669 | 1.3426 |
| 0.5685 | 1200 | 3.7505 | 1.3160 |
| 0.6158 | 1300 | 3.6529 | 1.2822 |
| 0.6632 | 1400 | 3.5203 | 1.2612 |
| 0.7106 | 1500 | 5.1906 | 1.4469 |
| 0.7579 | 1600 | 4.0273 | 1.6219 |
| 0.8053 | 1700 | 4.8308 | 3.1338 |
| 0.8527 | 1800 | 0.5336 | 3.2854 |
| 0.9000 | 1900 | 0.3 | 3.3757 |
| 0.9474 | 2000 | 0.0886 | 3.3620 |
| 0.9948 | 2100 | 0.0817 | 3.3510 |
| 1.0417 | 2200 | 4.0692 | 1.3638 |
### Framework Versions
- Python: 3.10.12
- Sentence Transformers: 3.3.1
- Transformers: 4.47.1
- PyTorch: 2.5.1+cu121
- Accelerate: 1.2.1
- Datasets: 3.2.0
- Tokenizers: 0.21.0
## NanoBEIREvaluator metrics above 0.8
```
{
"NanoDBPedia_cosine_accuracy@3": 0.86,
"NanoDBPedia_cosine_accuracy@5": 0.92,
"NanoDBPedia_cosine_accuracy@10": 0.96,
"NanoFEVER_cosine_accuracy@3": 0.86,
"NanoFEVER_cosine_accuracy@5": 0.92,
"NanoFEVER_cosine_accuracy@10": 0.96,
"NanoQuoraRetrieval_cosine_accuracy@1": 0.88,
"NanoQuoraRetrieval_cosine_accuracy@3": 0.96,
"NanoQuoraRetrieval_cosine_accuracy@5": 1.0,
"NanoQuoraRetrieval_cosine_accuracy@10": 1.0,
"NanoSCIDOCS_cosine_accuracy@5": 0.82,
"NanoSCIDOCS_cosine_accuracy@10": 0.92,
"NanoArguAna_cosine_accuracy@10": 0.92,
"NanoSciFact_cosine_accuracy@10": 0.88,
"NanoHotpotQA_cosine_accuracy@10": 0.88,
"NanoTouche2020_cosine_accuracy@5": 0.9183673469387755,
"NanoTouche2020_cosine_accuracy@10": 0.9387755102040817,
"NanoBEIR_mean_cosine_accuracy@10": 0.8583673469387756
}
```
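
These numbers can be reproduced with the `NanoBEIREvaluator` shipped in recent sentence-transformers releases. A small sketch, evaluating on an illustrative subset of the NanoBEIR datasets:

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import NanoBEIREvaluator

model = SentenceTransformer("NickyNicky/StaticEmbedding-MatryoshkaLoss-gemma-2-2b-en-es")

# Evaluate on a subset of the NanoBEIR collection (names follow the
# sentence-transformers NanoBEIR dataset ids)
evaluator = NanoBEIREvaluator(dataset_names=["QuoraRetrieval", "SciFact", "ArguAna"])
results = evaluator(model)

print(results["NanoBEIR_mean_cosine_ndcg@10"])
```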
## All NanoBEIREvaluator metrics
```
{'NanoClimateFEVER_cosine_accuracy@1': 0.28,
'NanoClimateFEVER_cosine_accuracy@3': 0.44,
'NanoClimateFEVER_cosine_accuracy@5': 0.54,
'NanoClimateFEVER_cosine_accuracy@10': 0.72,
'NanoClimateFEVER_cosine_precision@1': 0.28,
'NanoClimateFEVER_cosine_precision@3': 0.15333333333333332,
'NanoClimateFEVER_cosine_precision@5': 0.124,
'NanoClimateFEVER_cosine_precision@10': 0.08999999999999998,
'NanoClimateFEVER_cosine_recall@1': 0.145,
'NanoClimateFEVER_cosine_recall@3': 0.205,
'NanoClimateFEVER_cosine_recall@5': 0.264,
'NanoClimateFEVER_cosine_recall@10': 0.36200000000000004,
'NanoClimateFEVER_cosine_ndcg@10': 0.2957527689242254,
'NanoClimateFEVER_cosine_mrr@10': 0.3996666666666668,
'NanoClimateFEVER_cosine_map@100': 0.23258384801937396,
'NanoDBPedia_cosine_accuracy@1': 0.68,
'NanoDBPedia_cosine_accuracy@3': 0.86,
'NanoDBPedia_cosine_accuracy@5': 0.92,
'NanoDBPedia_cosine_accuracy@10': 0.96,
'NanoDBPedia_cosine_precision@1': 0.68,
'NanoDBPedia_cosine_precision@3': 0.56,
'NanoDBPedia_cosine_precision@5': 0.5120000000000001,
'NanoDBPedia_cosine_precision@10': 0.43800000000000006,
'NanoDBPedia_cosine_recall@1': 0.07601531530835434,
'NanoDBPedia_cosine_recall@3': 0.1438904710839341,
'NanoDBPedia_cosine_recall@5': 0.20681359525684506,
'NanoDBPedia_cosine_recall@10': 0.319966975132044,
'NanoDBPedia_cosine_ndcg@10': 0.5501100350453579,
'NanoDBPedia_cosine_mrr@10': 0.7855000000000001,
'NanoDBPedia_cosine_map@100': 0.39476156890024533,
'NanoFEVER_cosine_accuracy@1': 0.68,
'NanoFEVER_cosine_accuracy@3': 0.86,
'NanoFEVER_cosine_accuracy@5': 0.92,
'NanoFEVER_cosine_accuracy@10': 0.96,
'NanoFEVER_cosine_precision@1': 0.68,
'NanoFEVER_cosine_precision@3': 0.29333333333333333,
'NanoFEVER_cosine_precision@5': 0.19199999999999995,
'NanoFEVER_cosine_precision@10': 0.10199999999999998,
'NanoFEVER_cosine_recall@1': 0.6266666666666666,
'NanoFEVER_cosine_recall@3': 0.8133333333333332,
'NanoFEVER_cosine_recall@5': 0.8833333333333333,
'NanoFEVER_cosine_recall@10': 0.9233333333333333,
'NanoFEVER_cosine_ndcg@10': 0.7933479848498471,
'NanoFEVER_cosine_mrr@10': 0.7780793650793651,
'NanoFEVER_cosine_map@100': 0.7406571665049926,
'NanoFiQA2018_cosine_accuracy@1': 0.46,
'NanoFiQA2018_cosine_accuracy@3': 0.64,
'NanoFiQA2018_cosine_accuracy@5': 0.7,
'NanoFiQA2018_cosine_accuracy@10': 0.72,
'NanoFiQA2018_cosine_precision@1': 0.46,
'NanoFiQA2018_cosine_precision@3': 0.2866666666666666,
'NanoFiQA2018_cosine_precision@5': 0.22399999999999998,
'NanoFiQA2018_cosine_precision@10': 0.12999999999999998,
'NanoFiQA2018_cosine_recall@1': 0.23924603174603173,
'NanoFiQA2018_cosine_recall@3': 0.4251031746031746,
'NanoFiQA2018_cosine_recall@5': 0.5099603174603174,
'NanoFiQA2018_cosine_recall@10': 0.566015873015873,
'NanoFiQA2018_cosine_ndcg@10': 0.4774545077577204,
'NanoFiQA2018_cosine_mrr@10': 0.5475555555555556,
'NanoFiQA2018_cosine_map@100': 0.4125452702654584,
'NanoHotpotQA_cosine_accuracy@1': 0.64,
'NanoHotpotQA_cosine_accuracy@3': 0.82,
'NanoHotpotQA_cosine_accuracy@5': 0.84,
'NanoHotpotQA_cosine_accuracy@10': 0.88,
'NanoHotpotQA_cosine_precision@1': 0.64,
'NanoHotpotQA_cosine_precision@3': 0.3533333333333333,
'NanoHotpotQA_cosine_precision@5': 0.23599999999999993,
'NanoHotpotQA_cosine_precision@10': 0.128,
'NanoHotpotQA_cosine_recall@1': 0.32,
'NanoHotpotQA_cosine_recall@3': 0.53,
'NanoHotpotQA_cosine_recall@5': 0.59,
'NanoHotpotQA_cosine_recall@10': 0.64,
'NanoHotpotQA_cosine_ndcg@10': 0.5959681682828366,
'NanoHotpotQA_cosine_mrr@10': 0.723888888888889,
'NanoHotpotQA_cosine_map@100': 0.5262469568756968,
'NanoMSMARCO_cosine_accuracy@1': 0.36,
'NanoMSMARCO_cosine_accuracy@3': 0.52,
'NanoMSMARCO_cosine_accuracy@5': 0.58,
'NanoMSMARCO_cosine_accuracy@10': 0.8,
'NanoMSMARCO_cosine_precision@1': 0.36,
'NanoMSMARCO_cosine_precision@3': 0.1733333333333333,
'NanoMSMARCO_cosine_precision@5': 0.11599999999999999,
'NanoMSMARCO_cosine_precision@10': 0.08,
'NanoMSMARCO_cosine_recall@1': 0.36,
'NanoMSMARCO_cosine_recall@3': 0.52,
'NanoMSMARCO_cosine_recall@5': 0.58,
'NanoMSMARCO_cosine_recall@10': 0.8,
'NanoMSMARCO_cosine_ndcg@10': 0.5539831330912274,
'NanoMSMARCO_cosine_mrr@10': 0.47960317460317464,
'NanoMSMARCO_cosine_map@100': 0.4907628900864195,
'NanoNFCorpus_cosine_accuracy@1': 0.42,
'NanoNFCorpus_cosine_accuracy@3': 0.56,
'NanoNFCorpus_cosine_accuracy@5': 0.6,
'NanoNFCorpus_cosine_accuracy@10': 0.7,
'NanoNFCorpus_cosine_precision@1': 0.42,
'NanoNFCorpus_cosine_precision@3': 0.3466666666666666,
'NanoNFCorpus_cosine_precision@5': 0.32800000000000007,
'NanoNFCorpus_cosine_precision@10': 0.286,
'NanoNFCorpus_cosine_recall@1': 0.03391318439564492,
'NanoNFCorpus_cosine_recall@3': 0.06311668492872162,
'NanoNFCorpus_cosine_recall@5': 0.08191277059586696,
'NanoNFCorpus_cosine_recall@10': 0.13476845853527392,
'NanoNFCorpus_cosine_ndcg@10': 0.3322933792371396,
'NanoNFCorpus_cosine_mrr@10': 0.4983333333333333,
'NanoNFCorpus_cosine_map@100': 0.13985354018581944,
'NanoNQ_cosine_accuracy@1': 0.44,
'NanoNQ_cosine_accuracy@3': 0.64,
'NanoNQ_cosine_accuracy@5': 0.66,
'NanoNQ_cosine_accuracy@10': 0.76,
'NanoNQ_cosine_precision@1': 0.44,
'NanoNQ_cosine_precision@3': 0.22,
'NanoNQ_cosine_precision@5': 0.14,
'NanoNQ_cosine_precision@10': 0.08199999999999999,
'NanoNQ_cosine_recall@1': 0.42,
'NanoNQ_cosine_recall@3': 0.62,
'NanoNQ_cosine_recall@5': 0.64,
'NanoNQ_cosine_recall@10': 0.75,
'NanoNQ_cosine_ndcg@10': 0.5903874296113161,
'NanoNQ_cosine_mrr@10': 0.5456349206349206,
'NanoNQ_cosine_map@100': 0.5437440035864959,
'NanoQuoraRetrieval_cosine_accuracy@1': 0.88,
'NanoQuoraRetrieval_cosine_accuracy@3': 0.96,
'NanoQuoraRetrieval_cosine_accuracy@5': 1.0,
'NanoQuoraRetrieval_cosine_accuracy@10': 1.0,
'NanoQuoraRetrieval_cosine_precision@1': 0.88,
'NanoQuoraRetrieval_cosine_precision@3': 0.3933333333333333,
'NanoQuoraRetrieval_cosine_precision@5': 0.256,
'NanoQuoraRetrieval_cosine_precision@10': 0.13599999999999998,
'NanoQuoraRetrieval_cosine_recall@1': 0.784,
'NanoQuoraRetrieval_cosine_recall@3': 0.9186666666666667,
'NanoQuoraRetrieval_cosine_recall@5': 0.976,
'NanoQuoraRetrieval_cosine_recall@10': 0.9933333333333334,
'NanoQuoraRetrieval_cosine_ndcg@10': 0.9367841595958026,
'NanoQuoraRetrieval_cosine_mrr@10': 0.9246666666666666,
'NanoQuoraRetrieval_cosine_map@100': 0.913554834054834,
'NanoSCIDOCS_cosine_accuracy@1': 0.52,
'NanoSCIDOCS_cosine_accuracy@3': 0.68,
'NanoSCIDOCS_cosine_accuracy@5': 0.82,
'NanoSCIDOCS_cosine_accuracy@10': 0.92,
'NanoSCIDOCS_cosine_precision@1': 0.52,
'NanoSCIDOCS_cosine_precision@3': 0.3933333333333333,
'NanoSCIDOCS_cosine_precision@5': 0.33599999999999997,
'NanoSCIDOCS_cosine_precision@10': 0.21600000000000003,
'NanoSCIDOCS_cosine_recall@1': 0.10966666666666666,
'NanoSCIDOCS_cosine_recall@3': 0.24466666666666664,
'NanoSCIDOCS_cosine_recall@5': 0.34566666666666657,
'NanoSCIDOCS_cosine_recall@10': 0.44266666666666665,
'NanoSCIDOCS_cosine_ndcg@10': 0.4328110226758414,
'NanoSCIDOCS_cosine_mrr@10': 0.6317222222222222,
'NanoSCIDOCS_cosine_map@100': 0.34997841607847063,
'NanoArguAna_cosine_accuracy@1': 0.2,
'NanoArguAna_cosine_accuracy@3': 0.56,
'NanoArguAna_cosine_accuracy@5': 0.76,
'NanoArguAna_cosine_accuracy@10': 0.92,
'NanoArguAna_cosine_precision@1': 0.2,
'NanoArguAna_cosine_precision@3': 0.18666666666666668,
'NanoArguAna_cosine_precision@5': 0.15200000000000002,
'NanoArguAna_cosine_precision@10': 0.092,
'NanoArguAna_cosine_recall@1': 0.2,
'NanoArguAna_cosine_recall@3': 0.56,
'NanoArguAna_cosine_recall@5': 0.76,
'NanoArguAna_cosine_recall@10': 0.92,
'NanoArguAna_cosine_ndcg@10': 0.5499071039525992,
'NanoArguAna_cosine_mrr@10': 0.43229365079365073,
'NanoArguAna_cosine_map@100': 0.43523820792684886,
'NanoSciFact_cosine_accuracy@1': 0.6,
'NanoSciFact_cosine_accuracy@3': 0.72,
'NanoSciFact_cosine_accuracy@5': 0.8,
'NanoSciFact_cosine_accuracy@10': 0.88,
'NanoSciFact_cosine_precision@1': 0.6,
'NanoSciFact_cosine_precision@3': 0.25333333333333335,
'NanoSciFact_cosine_precision@5': 0.18,
'NanoSciFact_cosine_precision@10': 0.09799999999999999,
'NanoSciFact_cosine_recall@1': 0.58,
'NanoSciFact_cosine_recall@3': 0.7,
'NanoSciFact_cosine_recall@5': 0.8,
'NanoSciFact_cosine_recall@10': 0.87,
'NanoSciFact_cosine_ndcg@10': 0.7265348054031264,
'NanoSciFact_cosine_mrr@10': 0.6841031746031746,
'NanoSciFact_cosine_map@100': 0.6810233866101422,
'NanoTouche2020_cosine_accuracy@1': 0.5102040816326531,
'NanoTouche2020_cosine_accuracy@3': 0.8367346938775511,
'NanoTouche2020_cosine_accuracy@5': 0.9183673469387755,
'NanoTouche2020_cosine_accuracy@10': 0.9387755102040817,
'NanoTouche2020_cosine_precision@1': 0.5102040816326531,
'NanoTouche2020_cosine_precision@3': 0.5374149659863945,
'NanoTouche2020_cosine_precision@5': 0.5061224489795918,
'NanoTouche2020_cosine_precision@10': 0.43265306122448977,
'NanoTouche2020_cosine_recall@1': 0.03546508562664911,
'NanoTouche2020_cosine_recall@3': 0.11189238805791148,
'NanoTouche2020_cosine_recall@5': 0.1673503566176574,
'NanoTouche2020_cosine_recall@10': 0.2818808841266296,
'NanoTouche2020_cosine_ndcg@10': 0.47479704449085264,
'NanoTouche2020_cosine_mrr@10': 0.6714285714285714,
'NanoTouche2020_cosine_map@100': 0.3438320372291555,
'NanoBEIR_mean_cosine_accuracy@1': 0.5130926216640502,
'NanoBEIR_mean_cosine_accuracy@3': 0.6997488226059654,
'NanoBEIR_mean_cosine_accuracy@5': 0.7737205651491367,
'NanoBEIR_mean_cosine_accuracy@10': 0.8583673469387756,
'NanoBEIR_mean_cosine_precision@1': 0.5130926216640502,
'NanoBEIR_mean_cosine_precision@3': 0.31928833071690216,
'NanoBEIR_mean_cosine_precision@5': 0.2540094191522763,
'NanoBEIR_mean_cosine_precision@10': 0.1777425431711146,
'NanoBEIR_mean_cosine_recall@1': 0.302305611570001,
'NanoBEIR_mean_cosine_recall@3': 0.4504361065646467,
'NanoBEIR_mean_cosine_recall@5': 0.5234643876869758,
'NanoBEIR_mean_cosine_recall@10': 0.6156896557033196,
'NanoBEIR_mean_cosine_ndcg@10': 0.5623178109936842,
'NanoBEIR_mean_cosine_mrr@10': 0.6232673992673993,
'NanoBEIR_mean_cosine_map@100': 0.47729093279415025}
```
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
```
#### MatryoshkaLoss
```bibtex
@misc{kusupati2024matryoshka,
title={Matryoshka Representation Learning},
author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
year={2024},
eprint={2205.13147},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
```
#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
--> | [
"TEXT_CLASSIFICATION"
]
| [
"CRAFT"
]
| Non_BioNLP |
<!--
### Nicko's fine-tuning test colab.
https://colab.research.google.com/drive/1IbcgP-KT01-5csBBB-SJ6kMiI1Udbokt#scrollTo=XgNQ1C1wWbTg&uniqifier=1
-->
# SentenceTransformer
This is a [sentence-transformers](https://www.SBERT.net) model. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
<!-- - **Base model:** [Unknown](https://huggingface.co/unknown) -->
- **Maximum Sequence Length:** inf tokens
- **Output Dimensionality:** 1024 dimensions
- **Similarity Function:** Cosine Similarity
<!-- - **Training Dataset:** Unknown -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): StaticEmbedding(
(embedding): EmbeddingBag(256000, 1024, mode='mean')
)
)
```
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("NickyNicky/StaticEmbedding-MatryoshkaLoss-gemma-2-2b-en-es")
# Run inference
sentences = [
"how to reverse a video on tiktok that's not yours?",
'[\'Tap "Effects" at the bottom of your screen — it\\\'s an icon that looks like a clock. Open the Effects menu. ... \', \'At the end of the new list that appears, tap "Time." Select "Time" at the end. ... \', \'Select "Reverse" — you\\\'ll then see a preview of your new, reversed video appear on the screen.\']',
'Relative age is the age of a rock layer (or the fossils it contains) compared to other layers. It can be determined by looking at the position of rock layers. Absolute age is the numeric age of a layer of rocks or fossils. Absolute age can be determined by using radiometric dating.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1024]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
<!--
### Direct Usage (Transformers)
<details><summary>Click to see the direct usage in Transformers</summary>
</details>
-->
<!--
### Downstream Usage (Sentence Transformers)
You can finetune this model on your own dataset.
<details><summary>Click to expand</summary>
</details>
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Dataset
#### Unnamed Dataset
* Size: 4,322,286 training samples in English and Spanish (news, QA, summarization, and cryptocurrency-news data).
* Columns: <code>question</code> and <code>answer</code>
* Approximate statistics based on the first 1000 samples:
| | question | answer |
|:--------|:-----------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 18 characters</li><li>mean: 43.23 characters</li><li>max: 96 characters</li></ul> | <ul><li>min: 55 characters</li><li>mean: 253.36 characters</li><li>max: 371 characters</li></ul> |
* Samples:
| question | answer |
|:-----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>what is the difference between broilers and layers?</code> | <code>An egg laying poultry is called egger or layer whereas broilers are reared for obtaining meat. So a layer should be able to produce more number of large sized eggs, without growing too much. On the other hand, a broiler should yield more meat and hence should be able to grow well.</code> |
| <code>what is the difference between chronological order and spatial order?</code> | <code>As a writer, you should always remember that unlike chronological order and the other organizational methods for data, spatial order does not take into account the time. Spatial order is primarily focused on the location. All it does is take into account the location of objects and not the time.</code> |
| <code>is kamagra same as viagra?</code> | <code>Kamagra is thought to contain the same active ingredient as Viagra, sildenafil citrate. In theory, it should work in much the same way as Viagra, taking about 45 minutes to take effect, and lasting for around 4-6 hours. However, this will vary from person to person.</code> |
* Loss: [<code>MatryoshkaLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters:
```json
{
"loss": "MultipleNegativesRankingLoss",
"matryoshka_dims": [
1024,
768,
512,
256,
128,
64,
32
],
"matryoshka_weights": [
1,
1,
1,
1,
1,
1,
1
],
"n_dims_per_step": -1
}
```
### Evaluation Dataset
#### Unnamed Dataset
* Size: 10,005 evaluation samples
* Columns: <code>question</code> and <code>answer</code>
* Approximate statistics based on the first 1000 samples:
| | question | answer |
|:--------|:-----------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 18 characters</li><li>mean: 43.17 characters</li><li>max: 98 characters</li></ul> | <ul><li>min: 51 characters</li><li>mean: 254.12 characters</li><li>max: 360 characters</li></ul> |
* Samples:
| question | answer |
|:-----------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>how do i program my directv remote with my tv?</code> | <code>['Press MENU on your remote.', 'Select Settings & Help > Settings > Remote Control > Program Remote.', 'Choose the device (TV, audio, DVD) you wish to program. ... ', 'Follow the on-screen prompts to complete programming.']</code> |
| <code>are rodrigues fruit bats nocturnal?</code> | <code>Before its numbers were threatened by habitat destruction, storms, and hunting, some of those groups could number 500 or more members. Sunrise, sunset. Rodrigues fruit bats are most active at dawn, at dusk, and at night.</code> |
| <code>why does your heart rate increase during exercise bbc bitesize?</code> | <code>During exercise there is an increase in physical activity and muscle cells respire more than they do when the body is at rest. The heart rate increases during exercise. The rate and depth of breathing increases - this makes sure that more oxygen is absorbed into the blood, and more carbon dioxide is removed from it.</code> |
* Loss: [<code>MatryoshkaLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters:
```json
{
"loss": "MultipleNegativesRankingLoss",
"matryoshka_dims": [
1024,
768,
512,
256,
128,
64,
32
],
"matryoshka_weights": [
1,
1,
1,
1,
1,
1,
1
],
"n_dims_per_step": -1
}
```
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: steps
- `per_device_train_batch_size`: 2048
- `per_device_eval_batch_size`: 2048
- `learning_rate`: 0.2
- `warmup_ratio`: 0.1
- `bf16`: True
- `batch_sampler`: no_duplicates
#### All Hyperparameters
<details><summary>Click to expand</summary>
- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 2048
- `per_device_eval_batch_size`: 2048
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 0.2
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 3
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: True
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: None
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `include_for_metrics`: []
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `average_tokens_across_devices`: False
- `prompts`: None
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional
</details>
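As a rough sketch, the non-default values above map onto `SentenceTransformerTrainingArguments` as follows (`output_dir` is a placeholder):

```python
from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="output",                        # placeholder
    eval_strategy="steps",
    per_device_train_batch_size=2048,
    per_device_eval_batch_size=2048,
    learning_rate=0.2,
    warmup_ratio=0.1,
    bf16=True,
    batch_sampler=BatchSamplers.NO_DUPLICATES,  # no duplicate texts within a batch
)
```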
### Training Logs
| Epoch | Step | Training Loss | Validation Loss |
|:------:|:----:|:-------------:|:---------------:|
| 0.0005 | 1 | 49.8746 | - |
| 0.0474 | 100 | 35.8567 | 7.1776 |
| 0.0947 | 200 | 13.988 | 3.2848 |
| 0.1421 | 300 | 8.0009 | 2.3610 |
| 0.1895 | 400 | 6.3293 | 2.0293 |
| 0.2369 | 500 | 5.6296 | 1.8849 |
| 0.2842 | 600 | 5.238 | 1.7495 |
| 0.3316 | 700 | 4.9115 | 1.6694 |
| 0.3790 | 800 | 4.5779 | 1.5583 |
| 0.4263 | 900 | 4.2608 | 1.4784 |
| 0.4737 | 1000 | 4.0893 | 1.4020 |
| 0.5211 | 1100 | 3.8669 | 1.3426 |
| 0.5685 | 1200 | 3.7505 | 1.3160 |
| 0.6158 | 1300 | 3.6529 | 1.2822 |
| 0.6632 | 1400 | 3.5203 | 1.2612 |
| 0.7106 | 1500 | 5.1906 | 1.4469 |
| 0.7579 | 1600 | 4.0273 | 1.6219 |
| 0.8053 | 1700 | 4.8308 | 3.1338 |
| 0.8527 | 1800 | 0.5336 | 3.2854 |
| 0.9000 | 1900 | 0.3 | 3.3757 |
| 0.9474 | 2000 | 0.0886 | 3.3620 |
| 0.9948 | 2100 | 0.0817 | 3.3510 |
| 1.0417 | 2200 | 4.0692 | 1.3638 |
### Framework Versions
- Python: 3.10.12
- Sentence Transformers: 3.3.1
- Transformers: 4.47.1
- PyTorch: 2.5.1+cu121
- Accelerate: 1.2.1
- Datasets: 3.2.0
- Tokenizers: 0.21.0
## NanoBEIREvaluator metrics above 0.8
```json
{
"NanoDBPedia_cosine_accuracy@3": 0.86,
"NanoDBPedia_cosine_accuracy@5": 0.92,
"NanoDBPedia_cosine_accuracy@10": 0.96,
"NanoFEVER_cosine_accuracy@3": 0.86,
"NanoFEVER_cosine_accuracy@5": 0.92,
"NanoFEVER_cosine_accuracy@10": 0.96,
"NanoQuoraRetrieval_cosine_accuracy@1": 0.88,
"NanoQuoraRetrieval_cosine_accuracy@3": 0.96,
"NanoQuoraRetrieval_cosine_accuracy@5": 1.0,
"NanoQuoraRetrieval_cosine_accuracy@10": 1.0,
"NanoSCIDOCS_cosine_accuracy@5": 0.82,
"NanoSCIDOCS_cosine_accuracy@10": 0.92,
"NanoArguAna_cosine_accuracy@10": 0.92,
"NanoSciFact_cosine_accuracy@10": 0.88,
"NanoHotpotQA_cosine_accuracy@10": 0.88,
"NanoTouche2020_cosine_accuracy@5": 0.9183673469387755,
"NanoTouche2020_cosine_accuracy@10": 0.9387755102040817,
"NanoBEIR_mean_cosine_accuracy@10": 0.8583673469387756
}
```
## All NanoBEIREvaluator metrics
```python
{'NanoClimateFEVER_cosine_accuracy@1': 0.28,
'NanoClimateFEVER_cosine_accuracy@3': 0.44,
'NanoClimateFEVER_cosine_accuracy@5': 0.54,
'NanoClimateFEVER_cosine_accuracy@10': 0.72,
'NanoClimateFEVER_cosine_precision@1': 0.28,
'NanoClimateFEVER_cosine_precision@3': 0.15333333333333332,
'NanoClimateFEVER_cosine_precision@5': 0.124,
'NanoClimateFEVER_cosine_precision@10': 0.08999999999999998,
'NanoClimateFEVER_cosine_recall@1': 0.145,
'NanoClimateFEVER_cosine_recall@3': 0.205,
'NanoClimateFEVER_cosine_recall@5': 0.264,
'NanoClimateFEVER_cosine_recall@10': 0.36200000000000004,
'NanoClimateFEVER_cosine_ndcg@10': 0.2957527689242254,
'NanoClimateFEVER_cosine_mrr@10': 0.3996666666666668,
'NanoClimateFEVER_cosine_map@100': 0.23258384801937396,
'NanoDBPedia_cosine_accuracy@1': 0.68,
'NanoDBPedia_cosine_accuracy@3': 0.86,
'NanoDBPedia_cosine_accuracy@5': 0.92,
'NanoDBPedia_cosine_accuracy@10': 0.96,
'NanoDBPedia_cosine_precision@1': 0.68,
'NanoDBPedia_cosine_precision@3': 0.56,
'NanoDBPedia_cosine_precision@5': 0.5120000000000001,
'NanoDBPedia_cosine_precision@10': 0.43800000000000006,
'NanoDBPedia_cosine_recall@1': 0.07601531530835434,
'NanoDBPedia_cosine_recall@3': 0.1438904710839341,
'NanoDBPedia_cosine_recall@5': 0.20681359525684506,
'NanoDBPedia_cosine_recall@10': 0.319966975132044,
'NanoDBPedia_cosine_ndcg@10': 0.5501100350453579,
'NanoDBPedia_cosine_mrr@10': 0.7855000000000001,
'NanoDBPedia_cosine_map@100': 0.39476156890024533,
'NanoFEVER_cosine_accuracy@1': 0.68,
'NanoFEVER_cosine_accuracy@3': 0.86,
'NanoFEVER_cosine_accuracy@5': 0.92,
'NanoFEVER_cosine_accuracy@10': 0.96,
'NanoFEVER_cosine_precision@1': 0.68,
'NanoFEVER_cosine_precision@3': 0.29333333333333333,
'NanoFEVER_cosine_precision@5': 0.19199999999999995,
'NanoFEVER_cosine_precision@10': 0.10199999999999998,
'NanoFEVER_cosine_recall@1': 0.6266666666666666,
'NanoFEVER_cosine_recall@3': 0.8133333333333332,
'NanoFEVER_cosine_recall@5': 0.8833333333333333,
'NanoFEVER_cosine_recall@10': 0.9233333333333333,
'NanoFEVER_cosine_ndcg@10': 0.7933479848498471,
'NanoFEVER_cosine_mrr@10': 0.7780793650793651,
'NanoFEVER_cosine_map@100': 0.7406571665049926,
'NanoFiQA2018_cosine_accuracy@1': 0.46,
'NanoFiQA2018_cosine_accuracy@3': 0.64,
'NanoFiQA2018_cosine_accuracy@5': 0.7,
'NanoFiQA2018_cosine_accuracy@10': 0.72,
'NanoFiQA2018_cosine_precision@1': 0.46,
'NanoFiQA2018_cosine_precision@3': 0.2866666666666666,
'NanoFiQA2018_cosine_precision@5': 0.22399999999999998,
'NanoFiQA2018_cosine_precision@10': 0.12999999999999998,
'NanoFiQA2018_cosine_recall@1': 0.23924603174603173,
'NanoFiQA2018_cosine_recall@3': 0.4251031746031746,
'NanoFiQA2018_cosine_recall@5': 0.5099603174603174,
'NanoFiQA2018_cosine_recall@10': 0.566015873015873,
'NanoFiQA2018_cosine_ndcg@10': 0.4774545077577204,
'NanoFiQA2018_cosine_mrr@10': 0.5475555555555556,
'NanoFiQA2018_cosine_map@100': 0.4125452702654584,
'NanoHotpotQA_cosine_accuracy@1': 0.64,
'NanoHotpotQA_cosine_accuracy@3': 0.82,
'NanoHotpotQA_cosine_accuracy@5': 0.84,
'NanoHotpotQA_cosine_accuracy@10': 0.88,
'NanoHotpotQA_cosine_precision@1': 0.64,
'NanoHotpotQA_cosine_precision@3': 0.3533333333333333,
'NanoHotpotQA_cosine_precision@5': 0.23599999999999993,
'NanoHotpotQA_cosine_precision@10': 0.128,
'NanoHotpotQA_cosine_recall@1': 0.32,
'NanoHotpotQA_cosine_recall@3': 0.53,
'NanoHotpotQA_cosine_recall@5': 0.59,
'NanoHotpotQA_cosine_recall@10': 0.64,
'NanoHotpotQA_cosine_ndcg@10': 0.5959681682828366,
'NanoHotpotQA_cosine_mrr@10': 0.723888888888889,
'NanoHotpotQA_cosine_map@100': 0.5262469568756968,
'NanoMSMARCO_cosine_accuracy@1': 0.36,
'NanoMSMARCO_cosine_accuracy@3': 0.52,
'NanoMSMARCO_cosine_accuracy@5': 0.58,
'NanoMSMARCO_cosine_accuracy@10': 0.8,
'NanoMSMARCO_cosine_precision@1': 0.36,
'NanoMSMARCO_cosine_precision@3': 0.1733333333333333,
'NanoMSMARCO_cosine_precision@5': 0.11599999999999999,
'NanoMSMARCO_cosine_precision@10': 0.08,
'NanoMSMARCO_cosine_recall@1': 0.36,
'NanoMSMARCO_cosine_recall@3': 0.52,
'NanoMSMARCO_cosine_recall@5': 0.58,
'NanoMSMARCO_cosine_recall@10': 0.8,
'NanoMSMARCO_cosine_ndcg@10': 0.5539831330912274,
'NanoMSMARCO_cosine_mrr@10': 0.47960317460317464,
'NanoMSMARCO_cosine_map@100': 0.4907628900864195,
'NanoNFCorpus_cosine_accuracy@1': 0.42,
'NanoNFCorpus_cosine_accuracy@3': 0.56,
'NanoNFCorpus_cosine_accuracy@5': 0.6,
'NanoNFCorpus_cosine_accuracy@10': 0.7,
'NanoNFCorpus_cosine_precision@1': 0.42,
'NanoNFCorpus_cosine_precision@3': 0.3466666666666666,
'NanoNFCorpus_cosine_precision@5': 0.32800000000000007,
'NanoNFCorpus_cosine_precision@10': 0.286,
'NanoNFCorpus_cosine_recall@1': 0.03391318439564492,
'NanoNFCorpus_cosine_recall@3': 0.06311668492872162,
'NanoNFCorpus_cosine_recall@5': 0.08191277059586696,
'NanoNFCorpus_cosine_recall@10': 0.13476845853527392,
'NanoNFCorpus_cosine_ndcg@10': 0.3322933792371396,
'NanoNFCorpus_cosine_mrr@10': 0.4983333333333333,
'NanoNFCorpus_cosine_map@100': 0.13985354018581944,
'NanoNQ_cosine_accuracy@1': 0.44,
'NanoNQ_cosine_accuracy@3': 0.64,
'NanoNQ_cosine_accuracy@5': 0.66,
'NanoNQ_cosine_accuracy@10': 0.76,
'NanoNQ_cosine_precision@1': 0.44,
'NanoNQ_cosine_precision@3': 0.22,
'NanoNQ_cosine_precision@5': 0.14,
'NanoNQ_cosine_precision@10': 0.08199999999999999,
'NanoNQ_cosine_recall@1': 0.42,
'NanoNQ_cosine_recall@3': 0.62,
'NanoNQ_cosine_recall@5': 0.64,
'NanoNQ_cosine_recall@10': 0.75,
'NanoNQ_cosine_ndcg@10': 0.5903874296113161,
'NanoNQ_cosine_mrr@10': 0.5456349206349206,
'NanoNQ_cosine_map@100': 0.5437440035864959,
'NanoQuoraRetrieval_cosine_accuracy@1': 0.88,
'NanoQuoraRetrieval_cosine_accuracy@3': 0.96,
'NanoQuoraRetrieval_cosine_accuracy@5': 1.0,
'NanoQuoraRetrieval_cosine_accuracy@10': 1.0,
'NanoQuoraRetrieval_cosine_precision@1': 0.88,
'NanoQuoraRetrieval_cosine_precision@3': 0.3933333333333333,
'NanoQuoraRetrieval_cosine_precision@5': 0.256,
'NanoQuoraRetrieval_cosine_precision@10': 0.13599999999999998,
'NanoQuoraRetrieval_cosine_recall@1': 0.784,
'NanoQuoraRetrieval_cosine_recall@3': 0.9186666666666667,
'NanoQuoraRetrieval_cosine_recall@5': 0.976,
'NanoQuoraRetrieval_cosine_recall@10': 0.9933333333333334,
'NanoQuoraRetrieval_cosine_ndcg@10': 0.9367841595958026,
'NanoQuoraRetrieval_cosine_mrr@10': 0.9246666666666666,
'NanoQuoraRetrieval_cosine_map@100': 0.913554834054834,
'NanoSCIDOCS_cosine_accuracy@1': 0.52,
'NanoSCIDOCS_cosine_accuracy@3': 0.68,
'NanoSCIDOCS_cosine_accuracy@5': 0.82,
'NanoSCIDOCS_cosine_accuracy@10': 0.92,
'NanoSCIDOCS_cosine_precision@1': 0.52,
'NanoSCIDOCS_cosine_precision@3': 0.3933333333333333,
'NanoSCIDOCS_cosine_precision@5': 0.33599999999999997,
'NanoSCIDOCS_cosine_precision@10': 0.21600000000000003,
'NanoSCIDOCS_cosine_recall@1': 0.10966666666666666,
'NanoSCIDOCS_cosine_recall@3': 0.24466666666666664,
'NanoSCIDOCS_cosine_recall@5': 0.34566666666666657,
'NanoSCIDOCS_cosine_recall@10': 0.44266666666666665,
'NanoSCIDOCS_cosine_ndcg@10': 0.4328110226758414,
'NanoSCIDOCS_cosine_mrr@10': 0.6317222222222222,
'NanoSCIDOCS_cosine_map@100': 0.34997841607847063,
'NanoArguAna_cosine_accuracy@1': 0.2,
'NanoArguAna_cosine_accuracy@3': 0.56,
'NanoArguAna_cosine_accuracy@5': 0.76,
'NanoArguAna_cosine_accuracy@10': 0.92,
'NanoArguAna_cosine_precision@1': 0.2,
'NanoArguAna_cosine_precision@3': 0.18666666666666668,
'NanoArguAna_cosine_precision@5': 0.15200000000000002,
'NanoArguAna_cosine_precision@10': 0.092,
'NanoArguAna_cosine_recall@1': 0.2,
'NanoArguAna_cosine_recall@3': 0.56,
'NanoArguAna_cosine_recall@5': 0.76,
'NanoArguAna_cosine_recall@10': 0.92,
'NanoArguAna_cosine_ndcg@10': 0.5499071039525992,
'NanoArguAna_cosine_mrr@10': 0.43229365079365073,
'NanoArguAna_cosine_map@100': 0.43523820792684886,
'NanoSciFact_cosine_accuracy@1': 0.6,
'NanoSciFact_cosine_accuracy@3': 0.72,
'NanoSciFact_cosine_accuracy@5': 0.8,
'NanoSciFact_cosine_accuracy@10': 0.88,
'NanoSciFact_cosine_precision@1': 0.6,
'NanoSciFact_cosine_precision@3': 0.25333333333333335,
'NanoSciFact_cosine_precision@5': 0.18,
'NanoSciFact_cosine_precision@10': 0.09799999999999999,
'NanoSciFact_cosine_recall@1': 0.58,
'NanoSciFact_cosine_recall@3': 0.7,
'NanoSciFact_cosine_recall@5': 0.8,
'NanoSciFact_cosine_recall@10': 0.87,
'NanoSciFact_cosine_ndcg@10': 0.7265348054031264,
'NanoSciFact_cosine_mrr@10': 0.6841031746031746,
'NanoSciFact_cosine_map@100': 0.6810233866101422,
'NanoTouche2020_cosine_accuracy@1': 0.5102040816326531,
'NanoTouche2020_cosine_accuracy@3': 0.8367346938775511,
'NanoTouche2020_cosine_accuracy@5': 0.9183673469387755,
'NanoTouche2020_cosine_accuracy@10': 0.9387755102040817,
'NanoTouche2020_cosine_precision@1': 0.5102040816326531,
'NanoTouche2020_cosine_precision@3': 0.5374149659863945,
'NanoTouche2020_cosine_precision@5': 0.5061224489795918,
'NanoTouche2020_cosine_precision@10': 0.43265306122448977,
'NanoTouche2020_cosine_recall@1': 0.03546508562664911,
'NanoTouche2020_cosine_recall@3': 0.11189238805791148,
'NanoTouche2020_cosine_recall@5': 0.1673503566176574,
'NanoTouche2020_cosine_recall@10': 0.2818808841266296,
'NanoTouche2020_cosine_ndcg@10': 0.47479704449085264,
'NanoTouche2020_cosine_mrr@10': 0.6714285714285714,
'NanoTouche2020_cosine_map@100': 0.3438320372291555,
'NanoBEIR_mean_cosine_accuracy@1': 0.5130926216640502,
'NanoBEIR_mean_cosine_accuracy@3': 0.6997488226059654,
'NanoBEIR_mean_cosine_accuracy@5': 0.7737205651491367,
'NanoBEIR_mean_cosine_accuracy@10': 0.8583673469387756,
'NanoBEIR_mean_cosine_precision@1': 0.5130926216640502,
'NanoBEIR_mean_cosine_precision@3': 0.31928833071690216,
'NanoBEIR_mean_cosine_precision@5': 0.2540094191522763,
'NanoBEIR_mean_cosine_precision@10': 0.1777425431711146,
'NanoBEIR_mean_cosine_recall@1': 0.302305611570001,
'NanoBEIR_mean_cosine_recall@3': 0.4504361065646467,
'NanoBEIR_mean_cosine_recall@5': 0.5234643876869758,
'NanoBEIR_mean_cosine_recall@10': 0.6156896557033196,
'NanoBEIR_mean_cosine_ndcg@10': 0.5623178109936842,
'NanoBEIR_mean_cosine_mrr@10': 0.6232673992673993,
'NanoBEIR_mean_cosine_map@100': 0.47729093279415025}
```
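These figures can in principle be reproduced with the built-in `NanoBEIREvaluator` (available in sentence-transformers >= 3.3); the model path below is a placeholder:

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import NanoBEIREvaluator

model = SentenceTransformer("path/to/this-model")  # placeholder

evaluator = NanoBEIREvaluator()  # defaults to all 13 NanoBEIR datasets
results = evaluator(model)
print(results["NanoBEIR_mean_cosine_ndcg@10"])
```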
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
```
#### MatryoshkaLoss
```bibtex
@misc{kusupati2024matryoshka,
title={Matryoshka Representation Learning},
author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
year={2024},
eprint={2205.13147},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
```
#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
| {"library_name": "sentence-transformers", "license": "apache-2.0", "pipeline_tag": "sentence-similarity", "tags": ["sentence-transformers", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:4322286", "loss:MatryoshkaLoss", "loss:MultipleNegativesRankingLoss"], "widget": [{"source_sentence": "how to sign legal documents as power of attorney?", "sentences": ["After the principal's name, write “by” and then sign your own name. Under or after the signature line, indicate your status as POA by including any of the following identifiers: as POA, as Agent, as Attorney in Fact or as Power of Attorney.", "['From the Home screen, swipe left to Apps.', 'Tap Transfer my Data.', 'Tap Menu (...).', 'Tap Export to SD card.']", "Ginger Dank Nugs (Grape) - 350mg. Feast your eyes on these unique and striking gourmet chocolates; Coco Nugs created by Ginger Dank. Crafted to resemble perfect nugs of cannabis, each of the 10 buds contains 35mg of THC. ... This is a perfect product for both cannabis and chocolate lovers, who appreciate a little twist."]}, {"source_sentence": "how to delete vdom in fortigate?", "sentences": ["Go to System -> VDOM -> VDOM2 and select 'Delete'. This VDOM is now successfully removed from the configuration.", "Both combination birth control pills and progestin-only pills may cause headaches as a side effect. Additional side effects of birth control pills may include: breast tenderness. nausea.", "White cheese tends to show imperfections more readily and as consumers got more used to yellow-orange cheese, it became an expected option. Today, many cheddars are yellow. While most cheesemakers use annatto, some use an artificial coloring agent instead, according to Sachs."]}, {"source_sentence": "where are earthquakes most likely to occur on earth?", "sentences": ["Zelle in the Bank of the America app is a fast, safe, and easy way to send and receive money with family and friends who have a bank account in the U.S., all with no fees. Money moves in minutes directly between accounts that are already enrolled with Zelle.", "It takes about 3 days for a spacecraft to reach the Moon. During that time a spacecraft travels at least 240,000 miles (386,400 kilometers) which is the distance between Earth and the Moon.", "Most earthquakes occur along the edge of the oceanic and continental plates. The earth's crust (the outer layer of the planet) is made up of several pieces, called plates. The plates under the oceans are called oceanic plates and the rest are continental plates."]}, {"source_sentence": "fix iphone is disabled connect to itunes without itunes?", "sentences": ["To fix a disabled iPhone or iPad without iTunes, you have to erase your device. Click on the \"Erase iPhone\" option and confirm your selection. Wait for a while as the \"Find My iPhone\" feature will remotely erase your iOS device. Needless to say, it will also disable its lock.", "How Māui brought fire to the world. One evening, after eating a hearty meal, Māui lay beside his fire staring into the flames. ... \nIn the middle of the night, while everyone was sleeping, Māui went from village to village and extinguished all the fires until not a single fire burned in the world.", "Angry Orchard makes a variety of year-round craft cider styles, including Angry Orchard Crisp Apple, a fruit-forward hard cider that balances the sweetness of culinary apples with dryness and bright acidity of bittersweet apples for a complex, refreshing taste."]}, {"source_sentence": "how to reverse a video on tiktok that's not yours?", "sentences": ["['Tap \"Effects\" at the bottom of your screen — it\\'s an icon that looks like a clock. Open the Effects menu. ... ', 'At the end of the new list that appears, tap \"Time.\" Select \"Time\" at the end. ... ', 'Select \"Reverse\" — you\\'ll then see a preview of your new, reversed video appear on the screen.']", "Franchise Facts Poke Bar has a franchise fee of up to $30,000, with a total initial investment range of $157,800 to $438,000. The initial cost of a franchise includes several fees -- Unlock this franchise to better understand the costs such as training and territory fees.", "Relative age is the age of a rock layer (or the fossils it contains) compared to other layers. It can be determined by looking at the position of rock layers. Absolute age is the numeric age of a layer of rocks or fossils. Absolute age can be determined by using radiometric dating."]}]} |
BAAI/bge-large-zh-v1.5 | BAAI | feature-extraction | [
"sentence-transformers",
"pytorch",
"bert",
"feature-extraction",
"sentence-similarity",
"transformers",
"zh",
"arxiv:2401.03462",
"arxiv:2312.15503",
"arxiv:2311.13534",
"arxiv:2310.07554",
"arxiv:2309.07597",
"license:mit",
"autotrain_compatible",
"text-embeddings-inference",
"endpoints_compatible",
"region:us"
]
| 2023-09-12T05:22:11 | 2024-04-02T14:00:04 | 210,152 | 490 | ---
language:
- zh
license: mit
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
---
<h1 align="center">FlagEmbedding</h1>
<h4 align="center">
<p>
<a href="#model-list">Model List</a> | 
<a href="#frequently-asked-questions">FAQ</a> |
<a href="#usage">Usage</a>  |
<a href="#evaluation">Evaluation</a> |
<a href="#train">Train</a> |
<a href="#contact">Contact</a> |
<a href="#citation">Citation</a> |
<a href="#license">License</a>
</p>
</h4>
For more details, please refer to our GitHub repository: [FlagEmbedding](https://github.com/FlagOpen/FlagEmbedding).
If you are looking for a model that supports more languages, longer texts, and other retrieval methods, you can try using [bge-m3](https://huggingface.co/BAAI/bge-m3).
[English](README.md) | [中文](https://github.com/FlagOpen/FlagEmbedding/blob/master/README_zh.md)
FlagEmbedding focuses on retrieval-augmented LLMs and currently consists of the following projects:
- **Long-Context LLM**: [Activation Beacon](https://github.com/FlagOpen/FlagEmbedding/tree/master/Long_LLM/activation_beacon)
- **Fine-tuning of LM** : [LM-Cocktail](https://github.com/FlagOpen/FlagEmbedding/tree/master/LM_Cocktail)
- **Dense Retrieval**: [BGE-M3](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/BGE_M3), [LLM Embedder](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/llm_embedder), [BGE Embedding](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/baai_general_embedding)
- **Reranker Model**: [BGE Reranker](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/reranker)
- **Benchmark**: [C-MTEB](https://github.com/FlagOpen/FlagEmbedding/tree/master/C_MTEB)
## News
- 1/30/2024: Release **BGE-M3**, a new member of the BGE model series! M3 stands for **M**ulti-linguality (100+ languages), **M**ulti-granularity (input length up to 8192), and **M**ulti-functionality (unification of dense, lexical, and multi-vec/colbert retrieval).
It is the first embedding model that supports all three retrieval methods, achieving new SOTA on multi-lingual (MIRACL) and cross-lingual (MKQA) benchmarks.
[Technical Report](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/BGE_M3/BGE_M3.pdf) and [Code](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/BGE_M3). :fire:
- 1/9/2024: Release [Activation-Beacon](https://github.com/FlagOpen/FlagEmbedding/tree/master/Long_LLM/activation_beacon), an effective, efficient, compatible, and low-cost (training) method to extend the context length of LLM. [Technical Report](https://arxiv.org/abs/2401.03462) :fire:
- 12/24/2023: Release **LLaRA**, a LLaMA-7B based dense retriever, leading to state-of-the-art performances on MS MARCO and BEIR. Model and code will be open-sourced. Please stay tuned. [Technical Report](https://arxiv.org/abs/2312.15503) :fire:
- 11/23/2023: Release [LM-Cocktail](https://github.com/FlagOpen/FlagEmbedding/tree/master/LM_Cocktail), a method to maintain general capabilities during fine-tuning by merging multiple language models. [Technical Report](https://arxiv.org/abs/2311.13534) :fire:
- 10/12/2023: Release [LLM-Embedder](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/llm_embedder), a unified embedding model to support diverse retrieval augmentation needs for LLMs. [Technical Report](https://arxiv.org/pdf/2310.07554.pdf)
- 09/15/2023: The [technical report](https://arxiv.org/pdf/2309.07597.pdf) and [massive training data](https://data.baai.ac.cn/details/BAAI-MTP) of BGE have been released
- 09/12/2023: New models:
- **New reranker models**: release the cross-encoder models `BAAI/bge-reranker-base` and `BAAI/bge-reranker-large`, which are more powerful than the embedding models. We recommend using/fine-tuning them to re-rank the top-k documents returned by embedding models.
- **Updated embedding models**: release the `bge-*-v1.5` embedding models to alleviate the issue of the similarity distribution and enhance retrieval ability without instructions.
<details>
<summary>More</summary>
<!-- ### More -->
- 09/07/2023: Update [fine-tune code](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md): Add script to mine hard negatives and support adding instruction during fine-tuning.
- 08/09/2023: BGE models are integrated into **Langchain**; you can use them like [this](#using-langchain). The C-MTEB **leaderboard** is [available](https://huggingface.co/spaces/mteb/leaderboard).
- 08/05/2023: Release base-scale and small-scale models, **best performance among the models of the same size 🤗**
- 08/02/2023: Release `bge-large-*`(short for BAAI General Embedding) Models, **rank 1st on MTEB and C-MTEB benchmark!** :tada: :tada:
- 08/01/2023: We release the [Chinese Massive Text Embedding Benchmark](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB) (**C-MTEB**), consisting of 31 test datasets.
</details>
## Model List
`bge` is short for `BAAI general embedding`.
| Model | Language | | Description | query instruction for retrieval [1] |
|:-------------------------------|:--------:| :--------:| :--------:|:--------:|
| [BAAI/bge-m3](https://huggingface.co/BAAI/bge-m3) | Multilingual | [Inference](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/BGE_M3#usage) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/BGE_M3) | Multi-Functionality(dense retrieval, sparse retrieval, multi-vector(colbert)), Multi-Linguality, and Multi-Granularity(8192 tokens) | |
| [BAAI/llm-embedder](https://huggingface.co/BAAI/llm-embedder) | English | [Inference](./FlagEmbedding/llm_embedder/README.md) [Fine-tune](./FlagEmbedding/llm_embedder/README.md) | a unified embedding model to support diverse retrieval augmentation needs for LLMs | See [README](./FlagEmbedding/llm_embedder/README.md) |
| [BAAI/bge-reranker-large](https://huggingface.co/BAAI/bge-reranker-large) | Chinese and English | [Inference](#usage-for-reranker) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker) | a cross-encoder model which is more accurate but less efficient [2] | |
| [BAAI/bge-reranker-base](https://huggingface.co/BAAI/bge-reranker-base) | Chinese and English | [Inference](#usage-for-reranker) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker) | a cross-encoder model which is more accurate but less efficient [2] | |
| [BAAI/bge-large-en-v1.5](https://huggingface.co/BAAI/bge-large-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-large-zh-v1.5](https://huggingface.co/BAAI/bge-large-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-base-zh-v1.5](https://huggingface.co/BAAI/bge-base-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-small-zh-v1.5](https://huggingface.co/BAAI/bge-small-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-large-en](https://huggingface.co/BAAI/bge-large-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | :trophy: rank **1st** in [MTEB](https://huggingface.co/spaces/mteb/leaderboard) leaderboard | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-base-en](https://huggingface.co/BAAI/bge-base-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a base-scale model but with similar ability to `bge-large-en` | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-small-en](https://huggingface.co/BAAI/bge-small-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) |a small-scale model but with competitive performance | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-large-zh](https://huggingface.co/BAAI/bge-large-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | :trophy: rank **1st** in [C-MTEB](https://github.com/FlagOpen/FlagEmbedding/tree/master/C_MTEB) benchmark | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-base-zh](https://huggingface.co/BAAI/bge-base-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a base-scale model but with similar ability to `bge-large-zh` | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-small-zh](https://huggingface.co/BAAI/bge-small-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a small-scale model but with competitive performance | `为这个句子生成表示以用于检索相关文章:` |
[1\]: If you need to search for passages relevant to a query, we suggest adding the instruction to the query; in other cases, no instruction is needed, just use the original query directly. In all cases, **no instruction** needs to be added to passages.
[2\]: Different from an embedding model, a reranker takes a question and a document as input and directly outputs a similarity score instead of an embedding. To balance accuracy and time cost, cross-encoders are widely used to re-rank the top-k documents retrieved by other, simpler models.
For example, use a bge embedding model to retrieve the top 100 relevant documents, then use the bge reranker to re-rank those 100 documents and obtain the final top-3 results, as in the sketch below.
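A minimal sketch of this two-stage pipeline, using the FlagEmbedding APIs documented in the Usage section below (the corpus and query are placeholders):

```python
import numpy as np
from FlagEmbedding import FlagModel, FlagReranker

corpus = ["passage 1 ...", "passage 2 ..."]  # placeholder document collection
query = "what is a panda?"

# Stage 1: dense retrieval with the bge embedding model.
embedder = FlagModel("BAAI/bge-large-zh-v1.5",
                     query_instruction_for_retrieval="为这个句子生成表示以用于检索相关文章:")
scores = (embedder.encode_queries([query]) @ embedder.encode(corpus).T)[0]
candidates = np.argsort(-scores)[:100]  # indices of the top-100 candidates

# Stage 2: re-rank the candidates with the bge cross-encoder.
reranker = FlagReranker("BAAI/bge-reranker-large")
rerank_scores = np.array(reranker.compute_score([[query, corpus[i]] for i in candidates]))
top3 = [corpus[candidates[j]] for j in np.argsort(-rerank_scores)[:3]]
```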
All models have been uploaded to the Hugging Face Hub, and you can see them at https://huggingface.co/BAAI.
If you cannot access the Hugging Face Hub, you can also download the models at https://model.baai.ac.cn/models .
## Frequently asked questions
<details>
<summary>1. How to fine-tune bge embedding model?</summary>
<!-- ### How to fine-tune bge embedding model? -->
Follow this [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) to prepare data and fine-tune your model.
Some suggestions:
- Mine hard negatives following this [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune#hard-negatives), which can improve the retrieval performance.
- If you pre-train bge on your data, the pre-trained model cannot be directly used to calculate similarity, and it must be fine-tuned with contrastive learning before computing similarity.
- If the accuracy of the fine-tuned model is still not high, it is recommended to use/fine-tune the cross-encoder model (bge-reranker) to re-rank the top-k results. Hard negatives are also needed to fine-tune the reranker.
</details>
<details>
<summary>2. The similarity score between two dissimilar sentences is higher than 0.5</summary>
<!-- ### The similarity score between two dissimilar sentences is higher than 0.5 -->
**We suggest using bge v1.5, which alleviates the issue of the similarity distribution.**
Since we fine-tune the models by contrastive learning with a temperature of 0.01,
the similarity distribution of the current BGE model is roughly in the interval \[0.6, 1\].
So a similarity score greater than 0.5 does not indicate that the two sentences are similar.
For downstream tasks, such as passage retrieval or semantic similarity,
**what matters is the relative order of the scores, not the absolute value.**
If you need to filter similar sentences based on a similarity threshold,
please select an appropriate threshold based on the similarity distribution on your data (such as 0.8, 0.85, or even 0.9); see the short sketch after this FAQ.
</details>
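The short sketch referenced in the FAQ above; the 0.85 threshold is a placeholder and should be chosen from the similarity distribution on your own data:

```python
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("BAAI/bge-large-zh-v1.5")
pairs = [("样例数据-1", "样例数据-2"), ("样例数据-1", "样例数据-3")]

emb_a = model.encode([a for a, _ in pairs], normalize_embeddings=True)
emb_b = model.encode([b for _, b in pairs], normalize_embeddings=True)

cosine = np.sum(emb_a * emb_b, axis=1)  # cosine similarity per pair
similar = cosine >= 0.85                # placeholder threshold
```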
<details>
<summary>3. When does the query instruction need to be used</summary>
<!-- ### When does the query instruction need to be used -->
For `bge-*-v1.5`, we improved its retrieval ability when no instruction is used.
Omitting the instruction causes only a slight degradation in retrieval performance compared with using it,
so for convenience you can generate embeddings without an instruction in all cases.
For a retrieval task that uses short queries to find long related documents,
it is recommended to add instructions to these short queries.
**The best way to decide whether to add instructions to queries is to choose the setting that achieves better performance on your task.**
In all cases, the documents/passages do not need the instruction.
</details>
## Usage
### Usage for Embedding Model
Here are some examples for using `bge` models with
[FlagEmbedding](#using-flagembedding), [Sentence-Transformers](#using-sentence-transformers), [Langchain](#using-langchain), or [Huggingface Transformers](#using-huggingface-transformers).
#### Using FlagEmbedding
```
pip install -U FlagEmbedding
```
If this doesn't work for you, see [FlagEmbedding](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md) for other ways to install FlagEmbedding.
```python
from FlagEmbedding import FlagModel
sentences_1 = ["样例数据-1", "样例数据-2"]
sentences_2 = ["样例数据-3", "样例数据-4"]
model = FlagModel('BAAI/bge-large-zh-v1.5',
query_instruction_for_retrieval="为这个句子生成表示以用于检索相关文章:",
use_fp16=True) # Setting use_fp16 to True speeds up computation with a slight performance degradation
embeddings_1 = model.encode(sentences_1)
embeddings_2 = model.encode(sentences_2)
similarity = embeddings_1 @ embeddings_2.T
print(similarity)
# for s2p(short query to long passage) retrieval task, suggest to use encode_queries() which will automatically add the instruction to each query
# corpus in retrieval task can still use encode() or encode_corpus(), since they don't need instruction
queries = ['query_1', 'query_2']
passages = ["样例文档-1", "样例文档-2"]
q_embeddings = model.encode_queries(queries)
p_embeddings = model.encode(passages)
scores = q_embeddings @ p_embeddings.T
```
For the value of the argument `query_instruction_for_retrieval`, see [Model List](https://github.com/FlagOpen/FlagEmbedding/tree/master#model-list).
By default, FlagModel will use all available GPUs when encoding. Set `os.environ["CUDA_VISIBLE_DEVICES"]` to select specific GPUs,
or set `os.environ["CUDA_VISIBLE_DEVICES"]=""` to make all GPUs unavailable, as in the example below.
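Note that `CUDA_VISIBLE_DEVICES` must be set before CUDA is initialized, so set it at the top of your script:

```python
import os

os.environ["CUDA_VISIBLE_DEVICES"] = "0"   # encode on GPU 0 only
# os.environ["CUDA_VISIBLE_DEVICES"] = ""  # uncomment to force CPU

from FlagEmbedding import FlagModel  # import after setting the variable

model = FlagModel("BAAI/bge-large-zh-v1.5", use_fp16=True)
```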
#### Using Sentence-Transformers
You can also use the `bge` models with [sentence-transformers](https://www.SBERT.net):
```
pip install -U sentence-transformers
```
```python
from sentence_transformers import SentenceTransformer
sentences_1 = ["样例数据-1", "样例数据-2"]
sentences_2 = ["样例数据-3", "样例数据-4"]
model = SentenceTransformer('BAAI/bge-large-zh-v1.5')
embeddings_1 = model.encode(sentences_1, normalize_embeddings=True)
embeddings_2 = model.encode(sentences_2, normalize_embeddings=True)
similarity = embeddings_1 @ embeddings_2.T
print(similarity)
```
For the s2p (short query to long passage) retrieval task,
each short query should start with an instruction (for the instructions, see [Model List](https://github.com/FlagOpen/FlagEmbedding/tree/master#model-list)).
The instruction is not needed for passages.
```python
from sentence_transformers import SentenceTransformer
queries = ['query_1', 'query_2']
passages = ["样例文档-1", "样例文档-2"]
instruction = "为这个句子生成表示以用于检索相关文章:"
model = SentenceTransformer('BAAI/bge-large-zh-v1.5')
q_embeddings = model.encode([instruction+q for q in queries], normalize_embeddings=True)
p_embeddings = model.encode(passages, normalize_embeddings=True)
scores = q_embeddings @ p_embeddings.T
```
#### Using Langchain
You can use `bge` in langchain like this:
```python
from langchain.embeddings import HuggingFaceBgeEmbeddings
model_name = "BAAI/bge-large-en-v1.5"
model_kwargs = {'device': 'cuda'}
encode_kwargs = {'normalize_embeddings': True} # set True to compute cosine similarity
model = HuggingFaceBgeEmbeddings(
model_name=model_name,
model_kwargs=model_kwargs,
encode_kwargs=encode_kwargs,
query_instruction="为这个句子生成表示以用于检索相关文章:"
)
model.query_instruction = "为这个句子生成表示以用于检索相关文章:"
```
#### Using HuggingFace Transformers
With the transformers package, you can use the model like this: first, pass your input through the transformer model; then, select the last hidden state of the first token (i.e., [CLS]) as the sentence embedding.
```python
from transformers import AutoTokenizer, AutoModel
import torch
# Sentences we want sentence embeddings for
sentences = ["样例数据-1", "样例数据-2"]
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('BAAI/bge-large-zh-v1.5')
model = AutoModel.from_pretrained('BAAI/bge-large-zh-v1.5')
model.eval()
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# for s2p(short query to long passage) retrieval task, add an instruction to query (not add instruction for passages)
# encoded_input = tokenizer([instruction + q for q in queries], padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
model_output = model(**encoded_input)
# Perform pooling. In this case, cls pooling.
sentence_embeddings = model_output[0][:, 0]
# normalize embeddings
sentence_embeddings = torch.nn.functional.normalize(sentence_embeddings, p=2, dim=1)
print("Sentence embeddings:", sentence_embeddings)
```
### Usage for Reranker
Different from an embedding model, a reranker takes a question and a document as input and directly outputs a similarity score instead of an embedding.
You can get a relevance score by feeding a query and a passage to the reranker.
The reranker is optimized with a cross-entropy loss, so the relevance score is not bounded to a specific range.
#### Using FlagEmbedding
```
pip install -U FlagEmbedding
```
Get relevance scores (higher scores indicate more relevance):
```python
from FlagEmbedding import FlagReranker
reranker = FlagReranker('BAAI/bge-reranker-large', use_fp16=True) # Setting use_fp16 to True speeds up computation with a slight performance degradation
score = reranker.compute_score(['query', 'passage'])
print(score)
scores = reranker.compute_score([['what is panda?', 'hi'], ['what is panda?', 'The giant panda (Ailuropoda melanoleuca), sometimes called a panda bear or simply panda, is a bear species endemic to China.']])
print(scores)
```
#### Using Huggingface transformers
```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained('BAAI/bge-reranker-large')
model = AutoModelForSequenceClassification.from_pretrained('BAAI/bge-reranker-large')
model.eval()
pairs = [['what is panda?', 'hi'], ['what is panda?', 'The giant panda (Ailuropoda melanoleuca), sometimes called a panda bear or simply panda, is a bear species endemic to China.']]
with torch.no_grad():
inputs = tokenizer(pairs, padding=True, truncation=True, return_tensors='pt', max_length=512)
scores = model(**inputs, return_dict=True).logits.view(-1, ).float()
print(scores)
```
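Because the score is unbounded, a common convention (not specific to this model) is to squash it into (0, 1) with a sigmoid when a probability-like value is needed. Continuing from the snippet above:

```python
# `scores` holds the unbounded logits computed in the previous snippet.
probs = torch.sigmoid(scores)  # map each relevance score into (0, 1)
print(probs)
```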
## Evaluation
`baai-general-embedding` models achieve **state-of-the-art performance on both the MTEB and C-MTEB leaderboards!**
For more details and evaluation tools, see our [scripts](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/README.md).
- **MTEB**:
| Model Name | Dimension | Sequence Length | Average (56) | Retrieval (15) |Clustering (11) | Pair Classification (3) | Reranking (4) | STS (10) | Summarization (1) | Classification (12) |
|:----:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| [BAAI/bge-large-en-v1.5](https://huggingface.co/BAAI/bge-large-en-v1.5) | 1024 | 512 | **64.23** | **54.29** | 46.08 | 87.12 | 60.03 | 83.11 | 31.61 | 75.97 |
| [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) | 768 | 512 | 63.55 | 53.25 | 45.77 | 86.55 | 58.86 | 82.4 | 31.07 | 75.53 |
| [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5) | 384 | 512 | 62.17 |51.68 | 43.82 | 84.92 | 58.36 | 81.59 | 30.12 | 74.14 |
| [bge-large-en](https://huggingface.co/BAAI/bge-large-en) | 1024 | 512 | 63.98 | 53.9 | 46.98 | 85.8 | 59.48 | 81.56 | 32.06 | 76.21 |
| [bge-base-en](https://huggingface.co/BAAI/bge-base-en) | 768 | 512 | 63.36 | 53.0 | 46.32 | 85.86 | 58.7 | 81.84 | 29.27 | 75.27 |
| [gte-large](https://huggingface.co/thenlper/gte-large) | 1024 | 512 | 63.13 | 52.22 | 46.84 | 85.00 | 59.13 | 83.35 | 31.66 | 73.33 |
| [gte-base](https://huggingface.co/thenlper/gte-base) | 768 | 512 | 62.39 | 51.14 | 46.2 | 84.57 | 58.61 | 82.3 | 31.17 | 73.01 |
| [e5-large-v2](https://huggingface.co/intfloat/e5-large-v2) | 1024| 512 | 62.25 | 50.56 | 44.49 | 86.03 | 56.61 | 82.05 | 30.19 | 75.24 |
| [bge-small-en](https://huggingface.co/BAAI/bge-small-en) | 384 | 512 | 62.11 | 51.82 | 44.31 | 83.78 | 57.97 | 80.72 | 30.53 | 74.37 |
| [instructor-xl](https://huggingface.co/hkunlp/instructor-xl) | 768 | 512 | 61.79 | 49.26 | 44.74 | 86.62 | 57.29 | 83.06 | 32.32 | 61.79 |
| [e5-base-v2](https://huggingface.co/intfloat/e5-base-v2) | 768 | 512 | 61.5 | 50.29 | 43.80 | 85.73 | 55.91 | 81.05 | 30.28 | 73.84 |
| [gte-small](https://huggingface.co/thenlper/gte-small) | 384 | 512 | 61.36 | 49.46 | 44.89 | 83.54 | 57.7 | 82.07 | 30.42 | 72.31 |
| [text-embedding-ada-002](https://platform.openai.com/docs/guides/embeddings) | 1536 | 8192 | 60.99 | 49.25 | 45.9 | 84.89 | 56.32 | 80.97 | 30.8 | 70.93 |
| [e5-small-v2](https://huggingface.co/intfloat/e5-base-v2) | 384 | 512 | 59.93 | 49.04 | 39.92 | 84.67 | 54.32 | 80.39 | 31.16 | 72.94 |
| [sentence-t5-xxl](https://huggingface.co/sentence-transformers/sentence-t5-xxl) | 768 | 512 | 59.51 | 42.24 | 43.72 | 85.06 | 56.42 | 82.63 | 30.08 | 73.42 |
| [all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2) | 768 | 514 | 57.78 | 43.81 | 43.69 | 83.04 | 59.36 | 80.28 | 27.49 | 65.07 |
| [sgpt-bloom-7b1-msmarco](https://huggingface.co/bigscience/sgpt-bloom-7b1-msmarco) | 4096 | 2048 | 57.59 | 48.22 | 38.93 | 81.9 | 55.65 | 77.74 | 33.6 | 66.19 |
- **C-MTEB**:
We created the C-MTEB benchmark for Chinese text embeddings, consisting of 31 datasets across 6 tasks.
Please refer to [C_MTEB](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/README.md) for a detailed introduction.
| Model | Embedding dimension | Avg | Retrieval | STS | PairClassification | Classification | Reranking | Clustering |
|:-------------------------------|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|
| [**BAAI/bge-large-zh-v1.5**](https://huggingface.co/BAAI/bge-large-zh-v1.5) | 1024 | **64.53** | 70.46 | 56.25 | 81.6 | 69.13 | 65.84 | 48.99 |
| [BAAI/bge-base-zh-v1.5](https://huggingface.co/BAAI/bge-base-zh-v1.5) | 768 | 63.13 | 69.49 | 53.72 | 79.75 | 68.07 | 65.39 | 47.53 |
| [BAAI/bge-small-zh-v1.5](https://huggingface.co/BAAI/bge-small-zh-v1.5) | 512 | 57.82 | 61.77 | 49.11 | 70.41 | 63.96 | 60.92 | 44.18 |
| [BAAI/bge-large-zh](https://huggingface.co/BAAI/bge-large-zh) | 1024 | 64.20 | 71.53 | 54.98 | 78.94 | 68.32 | 65.11 | 48.39 |
| [bge-large-zh-noinstruct](https://huggingface.co/BAAI/bge-large-zh-noinstruct) | 1024 | 63.53 | 70.55 | 53 | 76.77 | 68.58 | 64.91 | 50.01 |
| [BAAI/bge-base-zh](https://huggingface.co/BAAI/bge-base-zh) | 768 | 62.96 | 69.53 | 54.12 | 77.5 | 67.07 | 64.91 | 47.63 |
| [multilingual-e5-large](https://huggingface.co/intfloat/multilingual-e5-large) | 1024 | 58.79 | 63.66 | 48.44 | 69.89 | 67.34 | 56.00 | 48.23 |
| [BAAI/bge-small-zh](https://huggingface.co/BAAI/bge-small-zh) | 512 | 58.27 | 63.07 | 49.45 | 70.35 | 63.64 | 61.48 | 45.09 |
| [m3e-base](https://huggingface.co/moka-ai/m3e-base) | 768 | 57.10 | 56.91 | 50.47 | 63.99 | 67.52 | 59.34 | 47.68 |
| [m3e-large](https://huggingface.co/moka-ai/m3e-large) | 1024 | 57.05 | 54.75 | 50.42 | 64.3 | 68.2 | 59.66 | 48.88 |
| [multilingual-e5-base](https://huggingface.co/intfloat/multilingual-e5-base) | 768 | 55.48 | 61.63 | 46.49 | 67.07 | 65.35 | 54.35 | 40.68 |
| [multilingual-e5-small](https://huggingface.co/intfloat/multilingual-e5-small) | 384 | 55.38 | 59.95 | 45.27 | 66.45 | 65.85 | 53.86 | 45.26 |
| [text-embedding-ada-002(OpenAI)](https://platform.openai.com/docs/guides/embeddings/what-are-embeddings) | 1536 | 53.02 | 52.0 | 43.35 | 69.56 | 64.31 | 54.28 | 45.68 |
| [luotuo](https://huggingface.co/silk-road/luotuo-bert-medium) | 1024 | 49.37 | 44.4 | 42.78 | 66.62 | 61 | 49.25 | 44.39 |
| [text2vec-base](https://huggingface.co/shibing624/text2vec-base-chinese) | 768 | 47.63 | 38.79 | 43.41 | 67.41 | 62.19 | 49.45 | 37.66 |
| [text2vec-large](https://huggingface.co/GanymedeNil/text2vec-large-chinese) | 1024 | 47.36 | 41.94 | 44.97 | 70.86 | 60.66 | 49.16 | 30.02 |
- **Reranking**:
See [C_MTEB](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/) for evaluation script.
| Model | T2Reranking | T2RerankingZh2En\* | T2RerankingEn2Zh\* | MMarcoReranking | CMedQAv1 | CMedQAv2 | Avg |
|:-------------------------------|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|
| text2vec-base-multilingual | 64.66 | 62.94 | 62.51 | 14.37 | 48.46 | 48.6 | 50.26 |
| multilingual-e5-small | 65.62 | 60.94 | 56.41 | 29.91 | 67.26 | 66.54 | 57.78 |
| multilingual-e5-large | 64.55 | 61.61 | 54.28 | 28.6 | 67.42 | 67.92 | 57.4 |
| multilingual-e5-base | 64.21 | 62.13 | 54.68 | 29.5 | 66.23 | 66.98 | 57.29 |
| m3e-base | 66.03 | 62.74 | 56.07 | 17.51 | 77.05 | 76.76 | 59.36 |
| m3e-large | 66.13 | 62.72 | 56.1 | 16.46 | 77.76 | 78.27 | 59.57 |
| bge-base-zh-v1.5 | 66.49 | 63.25 | 57.02 | 29.74 | 80.47 | 84.88 | 63.64 |
| bge-large-zh-v1.5 | 65.74 | 63.39 | 57.03 | 28.74 | 83.45 | 85.44 | 63.97 |
| [BAAI/bge-reranker-base](https://huggingface.co/BAAI/bge-reranker-base) | 67.28 | 63.95 | 60.45 | 35.46 | 81.26 | 84.1 | 65.42 |
| [BAAI/bge-reranker-large](https://huggingface.co/BAAI/bge-reranker-large) | 67.6 | 64.03 | 61.44 | 37.16 | 82.15 | 84.18 | 66.09 |
\* : T2RerankingZh2En and T2RerankingEn2Zh are cross-language retrieval tasks
## Train
### BAAI Embedding
We pre-train the models using [retromae](https://github.com/staoxiao/RetroMAE) and train them on large-scale pair data using contrastive learning.
**You can fine-tune the embedding model on your data following our [examples](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune).**
We also provide a [pre-train example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/pretrain).
Note that the goal of pre-training is to reconstruct the text; the pre-trained model cannot be used for similarity calculation directly and needs to be fine-tuned.
For more training details for bge, see [baai_general_embedding](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md).
### BGE Reranker
A cross-encoder performs full attention over the input pair,
which is more accurate than an embedding model (i.e., a bi-encoder) but more time-consuming.
Therefore, it can be used to re-rank the top-k documents returned by an embedding model.
We train the cross-encoder on multilingual pair data;
the data format is the same as for the embedding model, so you can fine-tune it easily following our [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker).
For more details, please refer to [./FlagEmbedding/reranker/README.md](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/reranker)
## Contact
If you have any questions or suggestions related to this project, feel free to open an issue or pull request.
You can also email Shitao Xiao([email protected]) and Zheng Liu([email protected]).
## Citation
If you find this repository useful, please consider giving it a star :star: and a citation
```
@misc{bge_embedding,
title={C-Pack: Packaged Resources To Advance General Chinese Embedding},
author={Shitao Xiao and Zheng Liu and Peitian Zhang and Niklas Muennighoff},
year={2023},
eprint={2309.07597},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
## License
FlagEmbedding is licensed under the [MIT License](https://github.com/FlagOpen/FlagEmbedding/blob/master/LICENSE). The released models can be used for commercial purposes free of charge. | [
"SEMANTIC_SIMILARITY",
"SUMMARIZATION"
]
| [
"BEAR"
]
| Non_BioNLP |
<h1 align="center">FlagEmbedding</h1>
<h4 align="center">
<p>
<a href=#model-list>Model List</a> |
<a href=#frequently-asked-questions>FAQ</a> |
<a href=#usage>Usage</a> |
<a href="#evaluation">Evaluation</a> |
<a href="#train">Train</a> |
<a href="#contact">Contact</a> |
<a href="#citation">Citation</a> |
<a href="#license">License</a>
<p>
</h4>
For more details please refer to our Github: [FlagEmbedding](https://github.com/FlagOpen/FlagEmbedding).
If you are looking for a model that supports more languages, longer texts, and other retrieval methods, you can try using [bge-m3](https://huggingface.co/BAAI/bge-m3).
[English](README.md) | [中文](https://github.com/FlagOpen/FlagEmbedding/blob/master/README_zh.md)
FlagEmbedding focuses on retrieval-augmented LLMs, consisting of the following projects currently:
- **Long-Context LLM**: [Activation Beacon](https://github.com/FlagOpen/FlagEmbedding/tree/master/Long_LLM/activation_beacon)
- **Fine-tuning of LM** : [LM-Cocktail](https://github.com/FlagOpen/FlagEmbedding/tree/master/LM_Cocktail)
- **Dense Retrieval**: [BGE-M3](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/BGE_M3), [LLM Embedder](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/llm_embedder), [BGE Embedding](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/baai_general_embedding)
- **Reranker Model**: [BGE Reranker](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/reranker)
- **Benchmark**: [C-MTEB](https://github.com/FlagOpen/FlagEmbedding/tree/master/C_MTEB)
## News
- 1/30/2024: Release **BGE-M3**, a new member to BGE model series! M3 stands for **M**ulti-linguality (100+ languages), **M**ulti-granularities (input length up to 8192), **M**ulti-Functionality (unification of dense, lexical, multi-vec/colbert retrieval).
It is the first embedding model which supports all three retrieval methods, achieving new SOTA on multi-lingual (MIRACL) and cross-lingual (MKQA) benchmarks.
[Technical Report](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/BGE_M3/BGE_M3.pdf) and [Code](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/BGE_M3). :fire:
- 1/9/2024: Release [Activation-Beacon](https://github.com/FlagOpen/FlagEmbedding/tree/master/Long_LLM/activation_beacon), an effective, efficient, compatible, and low-cost (training) method to extend the context length of LLM. [Technical Report](https://arxiv.org/abs/2401.03462) :fire:
- 12/24/2023: Release **LLaRA**, a LLaMA-7B based dense retriever, leading to state-of-the-art performances on MS MARCO and BEIR. Model and code will be open-sourced. Please stay tuned. [Technical Report](https://arxiv.org/abs/2312.15503) :fire:
- 11/23/2023: Release [LM-Cocktail](https://github.com/FlagOpen/FlagEmbedding/tree/master/LM_Cocktail), a method to maintain general capabilities during fine-tuning by merging multiple language models. [Technical Report](https://arxiv.org/abs/2311.13534) :fire:
- 10/12/2023: Release [LLM-Embedder](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/llm_embedder), a unified embedding model to support diverse retrieval augmentation needs for LLMs. [Technical Report](https://arxiv.org/pdf/2310.07554.pdf)
- 09/15/2023: The [technical report](https://arxiv.org/pdf/2309.07597.pdf) and [massive training data](https://data.baai.ac.cn/details/BAAI-MTP) of BGE has been released
- 09/12/2023: New models:
- **New reranker model**: release cross-encoder models `BAAI/bge-reranker-base` and `BAAI/bge-reranker-large`, which are more powerful than embedding model. We recommend to use/fine-tune them to re-rank top-k documents returned by embedding models.
- **update embedding model**: release `bge-*-v1.5` embedding model to alleviate the issue of the similarity distribution, and enhance its retrieval ability without instruction.
<details>
<summary>More</summary>
<!-- ### More -->
- 09/07/2023: Update [fine-tune code](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md): Add script to mine hard negatives and support adding instruction during fine-tuning.
- 08/09/2023: BGE Models are integrated into **Langchain**, you can use it like [this](#using-langchain); C-MTEB **leaderboard** is [available](https://huggingface.co/spaces/mteb/leaderboard).
- 08/05/2023: Release base-scale and small-scale models, **best performance among the models of the same size 🤗**
- 08/02/2023: Release `bge-large-*`(short for BAAI General Embedding) Models, **rank 1st on MTEB and C-MTEB benchmark!** :tada: :tada:
- 08/01/2023: We release the [Chinese Massive Text Embedding Benchmark](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB) (**C-MTEB**), consisting of 31 test dataset.
</details>
## Model List
`bge` is short for `BAAI general embedding`.
| Model | Language | | Description | query instruction for retrieval [1] |
|:-------------------------------|:--------:| :--------:| :--------:|:--------:|
| [BAAI/bge-m3](https://huggingface.co/BAAI/bge-m3) | Multilingual | [Inference](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/BGE_M3#usage) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/BGE_M3) | Multi-Functionality(dense retrieval, sparse retrieval, multi-vector(colbert)), Multi-Linguality, and Multi-Granularity(8192 tokens) | |
| [BAAI/llm-embedder](https://huggingface.co/BAAI/llm-embedder) | English | [Inference](./FlagEmbedding/llm_embedder/README.md) [Fine-tune](./FlagEmbedding/llm_embedder/README.md) | a unified embedding model to support diverse retrieval augmentation needs for LLMs | See [README](./FlagEmbedding/llm_embedder/README.md) |
| [BAAI/bge-reranker-large](https://huggingface.co/BAAI/bge-reranker-large) | Chinese and English | [Inference](#usage-for-reranker) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker) | a cross-encoder model which is more accurate but less efficient [2] | |
| [BAAI/bge-reranker-base](https://huggingface.co/BAAI/bge-reranker-base) | Chinese and English | [Inference](#usage-for-reranker) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker) | a cross-encoder model which is more accurate but less efficient [2] | |
| [BAAI/bge-large-en-v1.5](https://huggingface.co/BAAI/bge-large-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-large-zh-v1.5](https://huggingface.co/BAAI/bge-large-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-base-zh-v1.5](https://huggingface.co/BAAI/bge-base-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-small-zh-v1.5](https://huggingface.co/BAAI/bge-small-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-large-en](https://huggingface.co/BAAI/bge-large-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | :trophy: rank **1st** in [MTEB](https://huggingface.co/spaces/mteb/leaderboard) leaderboard | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-base-en](https://huggingface.co/BAAI/bge-base-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a base-scale model but with similar ability to `bge-large-en` | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-small-en](https://huggingface.co/BAAI/bge-small-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) |a small-scale model but with competitive performance | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-large-zh](https://huggingface.co/BAAI/bge-large-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | :trophy: rank **1st** in [C-MTEB](https://github.com/FlagOpen/FlagEmbedding/tree/master/C_MTEB) benchmark | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-base-zh](https://huggingface.co/BAAI/bge-base-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a base-scale model but with similar ability to `bge-large-zh` | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-small-zh](https://huggingface.co/BAAI/bge-small-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a small-scale model but with competitive performance | `为这个句子生成表示以用于检索相关文章:` |
[1\]: If you need to search for passages relevant to a query, we suggest adding the instruction to the query; in other cases, no instruction is needed, just use the original query directly. In all cases, **no instruction** needs to be added to passages.
[2\]: Unlike an embedding model, a reranker takes a question and a document as input and directly outputs a similarity score instead of an embedding. To balance accuracy and time cost, cross-encoders are widely used to re-rank the top-k documents retrieved by other, simpler models.
For example, use a bge embedding model to retrieve the top 100 relevant documents, then use the bge reranker to re-rank those 100 documents to get the final top-3 results, as sketched below.
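Putting the two footnotes together, a minimal retrieve-then-rerank sketch might look like the following (the toy corpus, query, and candidate count are illustrative assumptions; the `FlagModel`/`FlagReranker` calls follow the usage sections later in this README):
```python
import numpy as np
from FlagEmbedding import FlagModel, FlagReranker

# Hypothetical toy corpus; in practice this is your document collection.
corpus = [
    "The giant panda is a bear species endemic to China.",
    "Paris is the capital of France.",
    "Pandas mainly eat bamboo.",
]
query = "what is panda?"

# Stage 1: dense retrieval with the bge embedding model.
embedder = FlagModel('BAAI/bge-large-en-v1.5',
                     query_instruction_for_retrieval="Represent this sentence for searching relevant passages: ",
                     use_fp16=True)
q_emb = embedder.encode_queries([query])
p_emb = embedder.encode(corpus)
scores = (q_emb @ p_emb.T)[0]
top_k = np.argsort(-scores)[:2]  # keep the 2 best candidates (use ~100 in practice)

# Stage 2: re-rank the retrieved candidates with the cross-encoder.
reranker = FlagReranker('BAAI/bge-reranker-large', use_fp16=True)
rerank_scores = reranker.compute_score([[query, corpus[i]] for i in top_k])
best = top_k[int(np.argmax(rerank_scores))]
print(corpus[best])
```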
All models have been uploaded to the Huggingface Hub; you can find them at https://huggingface.co/BAAI.
If you cannot access the Huggingface Hub, you can also download the models at https://model.baai.ac.cn/models .
## Frequently asked questions
<details>
<summary>1. How to fine-tune bge embedding model?</summary>
<!-- ### How to fine-tune bge embedding model? -->
Follow this [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) to prepare data and fine-tune your model.
Some suggestions:
- Mine hard negatives following this [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune#hard-negatives), which can improve retrieval performance; see the sketch after this list.
- If you pre-train bge on your own data, the pre-trained model cannot be used to calculate similarity directly; it must be fine-tuned with contrastive learning before computing similarity.
- If the accuracy of the fine-tuned model is still not high, we recommend using/fine-tuning the cross-encoder model (bge-reranker) to re-rank the top-k results. Hard negatives are also needed to fine-tune the reranker.
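As context for the first suggestion, here is a minimal sketch of mining hard negatives with the embedding model itself (the toy corpus and the number of negatives are illustrative assumptions; the official mining script linked above is the recommended tool):
```python
import numpy as np
from FlagEmbedding import FlagModel

# Illustrative hard-negative mining: for each query, take highly ranked
# passages that are NOT the labelled positive as hard negatives.
model = FlagModel('BAAI/bge-large-en-v1.5', use_fp16=True)

corpus = ["passage-0", "passage-1", "passage-2", "passage-3"]  # toy corpus
query, positive_idx = "example query", 0

q_emb = model.encode_queries([query])
p_emb = model.encode(corpus)
ranking = np.argsort(-(q_emb @ p_emb.T)[0])  # passages sorted by similarity

hard_negatives = [corpus[i] for i in ranking if i != positive_idx][:2]
print(hard_negatives)
```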
</details>
<details>
<summary>2. The similarity score between two dissimilar sentences is higher than 0.5</summary>
<!-- ### The similarity score between two dissimilar sentences is higher than 0.5 -->
**We suggest using bge v1.5, which alleviates the issue of the similarity distribution.**
Since we fine-tune the models by contrastive learning with a temperature of 0.01,
the similarity distribution of the current BGE models lies roughly in the interval \[0.6, 1\].
So a similarity score greater than 0.5 does not indicate that the two sentences are similar.
For downstream tasks such as passage retrieval or semantic similarity,
**what matters is the relative order of the scores, not their absolute values.**
If you need to filter similar sentences based on a similarity threshold,
please select an appropriate threshold based on the similarity distribution on your own data (such as 0.8, 0.85, or even 0.9); see the sketch below.
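For instance, a minimal sketch of threshold-based filtering (the 0.85 threshold and the sample sentences are illustrative assumptions; calibrate the threshold on your own data):
```python
from FlagEmbedding import FlagModel

model = FlagModel('BAAI/bge-large-zh-v1.5', use_fp16=True)
sentences = ["样例数据-1", "样例数据-2", "样例数据-3"]
anchor = "样例数据-1"  # the anchor matches itself with similarity ~1.0

# FlagModel returns normalized embeddings, so the dot product is cosine similarity.
sims = (model.encode([anchor]) @ model.encode(sentences).T)[0]

THRESHOLD = 0.85  # example value; pick it from your data's similarity distribution
similar = [s for s, sim in zip(sentences, sims) if sim >= THRESHOLD]
print(similar)
```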
</details>
<details>
<summary>3. When does the query instruction need to be used</summary>
<!-- ### When does the query instruction need to be used -->
For `bge-*-v1.5`, we improved its retrieval ability when no instruction is used;
omitting the instruction causes only a slight degradation in retrieval performance compared with using it.
So, for convenience, you can generate embeddings without instructions in all cases.
For a retrieval task that uses short queries to find long related documents,
it is recommended to add instructions to these short queries.
**The best way to decide whether to add instructions to queries is to choose the setting that achieves better performance on your task.**
In all cases, no instruction needs to be added to the documents/passages; a small comparison sketch follows.
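A minimal sketch of comparing the two settings (the query/passage pair is an illustrative assumption; in practice, compare retrieval metrics on a held-out set of your task):
```python
from FlagEmbedding import FlagModel

queries = ["what is panda?"]
passages = ["The giant panda is a bear species endemic to China."]

model = FlagModel('BAAI/bge-large-en-v1.5',
                  query_instruction_for_retrieval="Represent this sentence for searching relevant passages: ")
p_emb = model.encode(passages)  # passages never take the instruction

# Setting A: queries encoded WITH the instruction (encode_queries adds it)
scores_with = model.encode_queries(queries) @ p_emb.T
# Setting B: queries encoded WITHOUT the instruction (plain encode)
scores_without = model.encode(queries) @ p_emb.T

# Evaluate both settings on your own retrieval task and keep the better one.
print(scores_with, scores_without)
```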
</details>
## Usage
### Usage for Embedding Model
Here are some examples of using `bge` models with
[FlagEmbedding](#using-flagembedding), [Sentence-Transformers](#using-sentence-transformers), [Langchain](#using-langchain), or [Huggingface Transformers](#using-huggingface-transformers).
#### Using FlagEmbedding
```
pip install -U FlagEmbedding
```
If this doesn't work for you, see [FlagEmbedding](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md) for other ways to install FlagEmbedding.
```python
from FlagEmbedding import FlagModel
sentences_1 = ["样例数据-1", "样例数据-2"]
sentences_2 = ["样例数据-3", "样例数据-4"]
model = FlagModel('BAAI/bge-large-zh-v1.5',
query_instruction_for_retrieval="为这个句子生成表示以用于检索相关文章:",
use_fp16=True) # Setting use_fp16 to True speeds up computation with a slight performance degradation
embeddings_1 = model.encode(sentences_1)
embeddings_2 = model.encode(sentences_2)
similarity = embeddings_1 @ embeddings_2.T
print(similarity)
# for the s2p (short query to long passage) retrieval task, we suggest using encode_queries(), which automatically adds the instruction to each query
# the corpus in a retrieval task can still use encode() or encode_corpus(), since passages don't need the instruction
queries = ['query_1', 'query_2']
passages = ["样例文档-1", "样例文档-2"]
q_embeddings = model.encode_queries(queries)
p_embeddings = model.encode(passages)
scores = q_embeddings @ p_embeddings.T
```
For the value of the argument `query_instruction_for_retrieval`, see the [Model List](https://github.com/FlagOpen/FlagEmbedding/tree/master#model-list).
By default, FlagModel uses all available GPUs when encoding. Set `os.environ["CUDA_VISIBLE_DEVICES"]` to select specific GPUs,
or set `os.environ["CUDA_VISIBLE_DEVICES"]=""` to make all GPUs unavailable, as shown below.
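For example (a minimal sketch; the environment variable must be set before the model is created):
```python
import os
os.environ["CUDA_VISIBLE_DEVICES"] = "0"   # use only GPU 0
# os.environ["CUDA_VISIBLE_DEVICES"] = ""  # make all GPUs unavailable (CPU only)

from FlagEmbedding import FlagModel
model = FlagModel('BAAI/bge-large-zh-v1.5', use_fp16=True)
embeddings = model.encode(["样例数据-1"])
```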
#### Using Sentence-Transformers
You can also use the `bge` models with [sentence-transformers](https://www.SBERT.net):
```
pip install -U sentence-transformers
```
```python
from sentence_transformers import SentenceTransformer
sentences_1 = ["样例数据-1", "样例数据-2"]
sentences_2 = ["样例数据-3", "样例数据-4"]
model = SentenceTransformer('BAAI/bge-large-zh-v1.5')
embeddings_1 = model.encode(sentences_1, normalize_embeddings=True)
embeddings_2 = model.encode(sentences_2, normalize_embeddings=True)
similarity = embeddings_1 @ embeddings_2.T
print(similarity)
```
For the s2p (short query to long passage) retrieval task,
each short query should start with an instruction (for instructions, see the [Model List](https://github.com/FlagOpen/FlagEmbedding/tree/master#model-list)).
But the instruction is not needed for passages.
```python
from sentence_transformers import SentenceTransformer
queries = ['query_1', 'query_2']
passages = ["样例文档-1", "样例文档-2"]
instruction = "为这个句子生成表示以用于检索相关文章:"
model = SentenceTransformer('BAAI/bge-large-zh-v1.5')
q_embeddings = model.encode([instruction+q for q in queries], normalize_embeddings=True)
p_embeddings = model.encode(passages, normalize_embeddings=True)
scores = q_embeddings @ p_embeddings.T
```
#### Using Langchain
You can use `bge` in langchain like this:
```python
from langchain.embeddings import HuggingFaceBgeEmbeddings
model_name = "BAAI/bge-large-en-v1.5"
model_kwargs = {'device': 'cuda'}
encode_kwargs = {'normalize_embeddings': True} # set True to compute cosine similarity
model = HuggingFaceBgeEmbeddings(
model_name=model_name,
model_kwargs=model_kwargs,
encode_kwargs=encode_kwargs,
query_instruction="为这个句子生成表示以用于检索相关文章:"
)
model.query_instruction = "为这个句子生成表示以用于检索相关文章:"
```
#### Using HuggingFace Transformers
With the transformers package, you can use the model like this: first pass your input through the transformer model, then select the last hidden state of the first token (i.e., [CLS]) as the sentence embedding.
```python
from transformers import AutoTokenizer, AutoModel
import torch
# Sentences we want sentence embeddings for
sentences = ["样例数据-1", "样例数据-2"]
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('BAAI/bge-large-zh-v1.5')
model = AutoModel.from_pretrained('BAAI/bge-large-zh-v1.5')
model.eval()
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# for the s2p (short query to long passage) retrieval task, add an instruction to each query (no instruction is needed for passages)
# encoded_input = tokenizer([instruction + q for q in queries], padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
model_output = model(**encoded_input)
# Perform pooling. In this case, cls pooling.
sentence_embeddings = model_output[0][:, 0]
# normalize embeddings
sentence_embeddings = torch.nn.functional.normalize(sentence_embeddings, p=2, dim=1)
print("Sentence embeddings:", sentence_embeddings)
```
### Usage for Reranker
Unlike an embedding model, a reranker takes a question and a document as input and directly outputs a similarity score instead of an embedding.
You can get a relevance score by feeding a query and a passage to the reranker.
The reranker is optimized with cross-entropy loss, so the relevance score is not bounded to a specific range.
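If you need scores in a fixed range such as \[0, 1\], one common option is to pass the raw score through a sigmoid. The sketch below does this by hand and is not part of the FlagEmbedding API:
```python
import math

def sigmoid(x: float) -> float:
    """Map an unbounded relevance score to (0, 1), preserving the ranking."""
    return 1 / (1 + math.exp(-x))

raw_score = 5.76  # hypothetical output of reranker.compute_score(['query', 'passage'])
print(sigmoid(raw_score))  # ~0.997
```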
#### Using FlagEmbedding
```
pip install -U FlagEmbedding
```
Get relevance scores (higher scores indicate more relevance):
```python
from FlagEmbedding import FlagReranker
reranker = FlagReranker('BAAI/bge-reranker-large', use_fp16=True) # Setting use_fp16 to True speeds up computation with a slight performance degradation
score = reranker.compute_score(['query', 'passage'])
print(score)
scores = reranker.compute_score([['what is panda?', 'hi'], ['what is panda?', 'The giant panda (Ailuropoda melanoleuca), sometimes called a panda bear or simply panda, is a bear species endemic to China.']])
print(scores)
```
#### Using HuggingFace Transformers
```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained('BAAI/bge-reranker-large')
model = AutoModelForSequenceClassification.from_pretrained('BAAI/bge-reranker-large')
model.eval()
pairs = [['what is panda?', 'hi'], ['what is panda?', 'The giant panda (Ailuropoda melanoleuca), sometimes called a panda bear or simply panda, is a bear species endemic to China.']]
with torch.no_grad():
inputs = tokenizer(pairs, padding=True, truncation=True, return_tensors='pt', max_length=512)
scores = model(**inputs, return_dict=True).logits.view(-1, ).float()
print(scores)
```
## Evaluation
`baai-general-embedding` models achieve **state-of-the-art performance on both the MTEB and C-MTEB leaderboards!**
For more details and evaluation tools, see our [scripts](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/README.md).
- **MTEB**:
| Model Name | Dimension | Sequence Length | Average (56) | Retrieval (15) |Clustering (11) | Pair Classification (3) | Reranking (4) | STS (10) | Summarization (1) | Classification (12) |
|:----:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| [BAAI/bge-large-en-v1.5](https://huggingface.co/BAAI/bge-large-en-v1.5) | 1024 | 512 | **64.23** | **54.29** | 46.08 | 87.12 | 60.03 | 83.11 | 31.61 | 75.97 |
| [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) | 768 | 512 | 63.55 | 53.25 | 45.77 | 86.55 | 58.86 | 82.4 | 31.07 | 75.53 |
| [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5) | 384 | 512 | 62.17 |51.68 | 43.82 | 84.92 | 58.36 | 81.59 | 30.12 | 74.14 |
| [bge-large-en](https://huggingface.co/BAAI/bge-large-en) | 1024 | 512 | 63.98 | 53.9 | 46.98 | 85.8 | 59.48 | 81.56 | 32.06 | 76.21 |
| [bge-base-en](https://huggingface.co/BAAI/bge-base-en) | 768 | 512 | 63.36 | 53.0 | 46.32 | 85.86 | 58.7 | 81.84 | 29.27 | 75.27 |
| [gte-large](https://huggingface.co/thenlper/gte-large) | 1024 | 512 | 63.13 | 52.22 | 46.84 | 85.00 | 59.13 | 83.35 | 31.66 | 73.33 |
| [gte-base](https://huggingface.co/thenlper/gte-base) | 768 | 512 | 62.39 | 51.14 | 46.2 | 84.57 | 58.61 | 82.3 | 31.17 | 73.01 |
| [e5-large-v2](https://huggingface.co/intfloat/e5-large-v2) | 1024| 512 | 62.25 | 50.56 | 44.49 | 86.03 | 56.61 | 82.05 | 30.19 | 75.24 |
| [bge-small-en](https://huggingface.co/BAAI/bge-small-en) | 384 | 512 | 62.11 | 51.82 | 44.31 | 83.78 | 57.97 | 80.72 | 30.53 | 74.37 |
| [instructor-xl](https://huggingface.co/hkunlp/instructor-xl) | 768 | 512 | 61.79 | 49.26 | 44.74 | 86.62 | 57.29 | 83.06 | 32.32 | 61.79 |
| [e5-base-v2](https://huggingface.co/intfloat/e5-base-v2) | 768 | 512 | 61.5 | 50.29 | 43.80 | 85.73 | 55.91 | 81.05 | 30.28 | 73.84 |
| [gte-small](https://huggingface.co/thenlper/gte-small) | 384 | 512 | 61.36 | 49.46 | 44.89 | 83.54 | 57.7 | 82.07 | 30.42 | 72.31 |
| [text-embedding-ada-002](https://platform.openai.com/docs/guides/embeddings) | 1536 | 8192 | 60.99 | 49.25 | 45.9 | 84.89 | 56.32 | 80.97 | 30.8 | 70.93 |
| [e5-small-v2](https://huggingface.co/intfloat/e5-base-v2) | 384 | 512 | 59.93 | 49.04 | 39.92 | 84.67 | 54.32 | 80.39 | 31.16 | 72.94 |
| [sentence-t5-xxl](https://huggingface.co/sentence-transformers/sentence-t5-xxl) | 768 | 512 | 59.51 | 42.24 | 43.72 | 85.06 | 56.42 | 82.63 | 30.08 | 73.42 |
| [all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2) | 768 | 514 | 57.78 | 43.81 | 43.69 | 83.04 | 59.36 | 80.28 | 27.49 | 65.07 |
| [sgpt-bloom-7b1-msmarco](https://huggingface.co/bigscience/sgpt-bloom-7b1-msmarco) | 4096 | 2048 | 57.59 | 48.22 | 38.93 | 81.9 | 55.65 | 77.74 | 33.6 | 66.19 |
- **C-MTEB**:
We created the C-MTEB benchmark for Chinese text embeddings, which consists of 31 datasets from 6 tasks.
Please refer to [C_MTEB](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/README.md) for a detailed introduction.
| Model | Embedding dimension | Avg | Retrieval | STS | PairClassification | Classification | Reranking | Clustering |
|:-------------------------------|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|
| [**BAAI/bge-large-zh-v1.5**](https://huggingface.co/BAAI/bge-large-zh-v1.5) | 1024 | **64.53** | 70.46 | 56.25 | 81.6 | 69.13 | 65.84 | 48.99 |
| [BAAI/bge-base-zh-v1.5](https://huggingface.co/BAAI/bge-base-zh-v1.5) | 768 | 63.13 | 69.49 | 53.72 | 79.75 | 68.07 | 65.39 | 47.53 |
| [BAAI/bge-small-zh-v1.5](https://huggingface.co/BAAI/bge-small-zh-v1.5) | 512 | 57.82 | 61.77 | 49.11 | 70.41 | 63.96 | 60.92 | 44.18 |
| [BAAI/bge-large-zh](https://huggingface.co/BAAI/bge-large-zh) | 1024 | 64.20 | 71.53 | 54.98 | 78.94 | 68.32 | 65.11 | 48.39 |
| [bge-large-zh-noinstruct](https://huggingface.co/BAAI/bge-large-zh-noinstruct) | 1024 | 63.53 | 70.55 | 53 | 76.77 | 68.58 | 64.91 | 50.01 |
| [BAAI/bge-base-zh](https://huggingface.co/BAAI/bge-base-zh) | 768 | 62.96 | 69.53 | 54.12 | 77.5 | 67.07 | 64.91 | 47.63 |
| [multilingual-e5-large](https://huggingface.co/intfloat/multilingual-e5-large) | 1024 | 58.79 | 63.66 | 48.44 | 69.89 | 67.34 | 56.00 | 48.23 |
| [BAAI/bge-small-zh](https://huggingface.co/BAAI/bge-small-zh) | 512 | 58.27 | 63.07 | 49.45 | 70.35 | 63.64 | 61.48 | 45.09 |
| [m3e-base](https://huggingface.co/moka-ai/m3e-base) | 768 | 57.10 | 56.91 | 50.47 | 63.99 | 67.52 | 59.34 | 47.68 |
| [m3e-large](https://huggingface.co/moka-ai/m3e-large) | 1024 | 57.05 | 54.75 | 50.42 | 64.3 | 68.2 | 59.66 | 48.88 |
| [multilingual-e5-base](https://huggingface.co/intfloat/multilingual-e5-base) | 768 | 55.48 | 61.63 | 46.49 | 67.07 | 65.35 | 54.35 | 40.68 |
| [multilingual-e5-small](https://huggingface.co/intfloat/multilingual-e5-small) | 384 | 55.38 | 59.95 | 45.27 | 66.45 | 65.85 | 53.86 | 45.26 |
| [text-embedding-ada-002(OpenAI)](https://platform.openai.com/docs/guides/embeddings/what-are-embeddings) | 1536 | 53.02 | 52.0 | 43.35 | 69.56 | 64.31 | 54.28 | 45.68 |
| [luotuo](https://huggingface.co/silk-road/luotuo-bert-medium) | 1024 | 49.37 | 44.4 | 42.78 | 66.62 | 61 | 49.25 | 44.39 |
| [text2vec-base](https://huggingface.co/shibing624/text2vec-base-chinese) | 768 | 47.63 | 38.79 | 43.41 | 67.41 | 62.19 | 49.45 | 37.66 |
| [text2vec-large](https://huggingface.co/GanymedeNil/text2vec-large-chinese) | 1024 | 47.36 | 41.94 | 44.97 | 70.86 | 60.66 | 49.16 | 30.02 |
- **Reranking**:
See [C_MTEB](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/) for the evaluation script.
| Model | T2Reranking | T2RerankingZh2En\* | T2RerankingEn2Zh\* | MMarcoReranking | CMedQAv1 | CMedQAv2 | Avg |
|:-------------------------------|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|
| text2vec-base-multilingual | 64.66 | 62.94 | 62.51 | 14.37 | 48.46 | 48.6 | 50.26 |
| multilingual-e5-small | 65.62 | 60.94 | 56.41 | 29.91 | 67.26 | 66.54 | 57.78 |
| multilingual-e5-large | 64.55 | 61.61 | 54.28 | 28.6 | 67.42 | 67.92 | 57.4 |
| multilingual-e5-base | 64.21 | 62.13 | 54.68 | 29.5 | 66.23 | 66.98 | 57.29 |
| m3e-base | 66.03 | 62.74 | 56.07 | 17.51 | 77.05 | 76.76 | 59.36 |
| m3e-large | 66.13 | 62.72 | 56.1 | 16.46 | 77.76 | 78.27 | 59.57 |
| bge-base-zh-v1.5 | 66.49 | 63.25 | 57.02 | 29.74 | 80.47 | 84.88 | 63.64 |
| bge-large-zh-v1.5 | 65.74 | 63.39 | 57.03 | 28.74 | 83.45 | 85.44 | 63.97 |
| [BAAI/bge-reranker-base](https://huggingface.co/BAAI/bge-reranker-base) | 67.28 | 63.95 | 60.45 | 35.46 | 81.26 | 84.1 | 65.42 |
| [BAAI/bge-reranker-large](https://huggingface.co/BAAI/bge-reranker-large) | 67.6 | 64.03 | 61.44 | 37.16 | 82.15 | 84.18 | 66.09 |
\* : T2RerankingZh2En and T2RerankingEn2Zh are cross-language retrieval tasks.
## Train
### BAAI Embedding
We pre-train the models using [retromae](https://github.com/staoxiao/RetroMAE) and train them on large-scale paired data using contrastive learning.
**You can fine-tune the embedding model on your own data following our [examples](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune).**
We also provide a [pre-train example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/pretrain).
Note that the goal of pre-training is to reconstruct the text; the pre-trained model cannot be used for similarity calculation directly and needs to be fine-tuned.
For more training details of bge, see [baai_general_embedding](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md).
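As context for the temperature of 0.01 mentioned in FAQ 2, here is a minimal sketch of an in-batch contrastive (InfoNCE) objective; this is an illustration of the general technique, not the project's exact training code:
```python
import torch
import torch.nn.functional as F

def in_batch_contrastive_loss(q_emb: torch.Tensor,
                              p_emb: torch.Tensor,
                              temperature: float = 0.01) -> torch.Tensor:
    """InfoNCE over in-batch negatives: the i-th passage is the positive
    for the i-th query; all other passages in the batch act as negatives."""
    q_emb = F.normalize(q_emb, p=2, dim=1)
    p_emb = F.normalize(p_emb, p=2, dim=1)
    logits = q_emb @ p_emb.T / temperature          # (batch, batch) cosine similarities
    labels = torch.arange(q_emb.size(0), device=q_emb.device)
    return F.cross_entropy(logits, labels)

# Toy usage with random embeddings.
q = torch.randn(4, 768)
p = torch.randn(4, 768)
print(in_batch_contrastive_loss(q, p))
```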
### BGE Reranker
A cross-encoder performs full attention over the input pair,
which is more accurate than an embedding model (i.e., a bi-encoder) but also more time-consuming.
Therefore, it can be used to re-rank the top-k documents returned by an embedding model.
We train the cross-encoder on multilingual pair data.
The data format is the same as for the embedding model, so you can fine-tune it easily following our [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker).
For more details, please refer to [./FlagEmbedding/reranker/README.md](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/reranker).
## Contact
If you have any questions or suggestions related to this project, feel free to open an issue or pull request.
You can also email Shitao Xiao ([email protected]) and Zheng Liu ([email protected]).
## Citation
If you find this repository useful, please consider giving it a star :star: and a citation.
```
@misc{bge_embedding,
title={C-Pack: Packaged Resources To Advance General Chinese Embedding},
author={Shitao Xiao and Zheng Liu and Peitian Zhang and Niklas Muennighoff},
year={2023},
eprint={2309.07597},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
## License
FlagEmbedding is licensed under the [MIT License](https://github.com/FlagOpen/FlagEmbedding/blob/master/LICENSE). The released models can be used for commercial purposes free of charge. | {"language": ["zh"], "license": "mit", "tags": ["sentence-transformers", "feature-extraction", "sentence-similarity", "transformers"]} |
CAiRE/UniVaR-lambda-80 | CAiRE | sentence-similarity | [
"sentence-transformers",
"safetensors",
"nomic_bert",
"feature-extraction",
"sentence-similarity",
"mteb",
"transformers",
"transformers.js",
"custom_code",
"en",
"arxiv:2402.01613",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"text-embeddings-inference",
"endpoints_compatible",
"region:us"
]
| 2024-06-14T17:55:40 | 2024-06-14T17:56:29 | 8 | 0 | ---
language:
- en
library_name: sentence-transformers
license: apache-2.0
pipeline_tag: sentence-similarity
tags:
- feature-extraction
- sentence-similarity
- mteb
- transformers
- transformers.js
model-index:
- name: epoch_0_model
results:
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (en)
type: mteb/amazon_counterfactual
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 76.8507462686567
- type: ap
value: 40.592189159090495
- type: f1
value: 71.01634655512476
- task:
type: Classification
dataset:
name: MTEB AmazonPolarityClassification
type: mteb/amazon_polarity
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 91.51892500000001
- type: ap
value: 88.50346762975335
- type: f1
value: 91.50342077459624
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (en)
type: mteb/amazon_reviews_multi
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 47.364
- type: f1
value: 46.72708080922794
- task:
type: Retrieval
dataset:
name: MTEB ArguAna
type: arguana
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 25.178
- type: map_at_10
value: 40.244
- type: map_at_100
value: 41.321999999999996
- type: map_at_1000
value: 41.331
- type: map_at_3
value: 35.016999999999996
- type: map_at_5
value: 37.99
- type: mrr_at_1
value: 25.605
- type: mrr_at_10
value: 40.422000000000004
- type: mrr_at_100
value: 41.507
- type: mrr_at_1000
value: 41.516
- type: mrr_at_3
value: 35.23
- type: mrr_at_5
value: 38.15
- type: ndcg_at_1
value: 25.178
- type: ndcg_at_10
value: 49.258
- type: ndcg_at_100
value: 53.776
- type: ndcg_at_1000
value: 53.995000000000005
- type: ndcg_at_3
value: 38.429
- type: ndcg_at_5
value: 43.803
- type: precision_at_1
value: 25.178
- type: precision_at_10
value: 7.831
- type: precision_at_100
value: 0.979
- type: precision_at_1000
value: 0.1
- type: precision_at_3
value: 16.121
- type: precision_at_5
value: 12.29
- type: recall_at_1
value: 25.178
- type: recall_at_10
value: 78.307
- type: recall_at_100
value: 97.866
- type: recall_at_1000
value: 99.57300000000001
- type: recall_at_3
value: 48.364000000000004
- type: recall_at_5
value: 61.451
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringP2P
type: mteb/arxiv-clustering-p2p
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 45.93034494751465
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringS2S
type: mteb/arxiv-clustering-s2s
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 36.64579480054327
- task:
type: Reranking
dataset:
name: MTEB AskUbuntuDupQuestions
type: mteb/askubuntudupquestions-reranking
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 60.601310529222054
- type: mrr
value: 75.04484896451656
- task:
type: STS
dataset:
name: MTEB BIOSSES
type: mteb/biosses-sts
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_pearson
value: 88.57797718095814
- type: cos_sim_spearman
value: 86.47064499110101
- type: euclidean_pearson
value: 87.4559602783142
- type: euclidean_spearman
value: 86.47064499110101
- type: manhattan_pearson
value: 87.7232764230245
- type: manhattan_spearman
value: 86.91222131777742
- task:
type: Classification
dataset:
name: MTEB Banking77Classification
type: mteb/banking77
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 84.5422077922078
- type: f1
value: 84.47657456950589
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringP2P
type: mteb/biorxiv-clustering-p2p
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 38.48953561974464
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringS2S
type: mteb/biorxiv-clustering-s2s
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 32.75995857510105
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackAndroidRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 30.008000000000003
- type: map_at_10
value: 39.51
- type: map_at_100
value: 40.841
- type: map_at_1000
value: 40.973
- type: map_at_3
value: 36.248999999999995
- type: map_at_5
value: 38.096999999999994
- type: mrr_at_1
value: 36.481
- type: mrr_at_10
value: 44.818000000000005
- type: mrr_at_100
value: 45.64
- type: mrr_at_1000
value: 45.687
- type: mrr_at_3
value: 42.036
- type: mrr_at_5
value: 43.782
- type: ndcg_at_1
value: 36.481
- type: ndcg_at_10
value: 45.152
- type: ndcg_at_100
value: 50.449
- type: ndcg_at_1000
value: 52.76499999999999
- type: ndcg_at_3
value: 40.161
- type: ndcg_at_5
value: 42.577999999999996
- type: precision_at_1
value: 36.481
- type: precision_at_10
value: 8.369
- type: precision_at_100
value: 1.373
- type: precision_at_1000
value: 0.186
- type: precision_at_3
value: 18.693
- type: precision_at_5
value: 13.533999999999999
- type: recall_at_1
value: 30.008000000000003
- type: recall_at_10
value: 56.108999999999995
- type: recall_at_100
value: 78.55499999999999
- type: recall_at_1000
value: 93.659
- type: recall_at_3
value: 41.754999999999995
- type: recall_at_5
value: 48.296
- type: map_at_1
value: 30.262
- type: map_at_10
value: 40.139
- type: map_at_100
value: 41.394
- type: map_at_1000
value: 41.526
- type: map_at_3
value: 37.155
- type: map_at_5
value: 38.785
- type: mrr_at_1
value: 38.153
- type: mrr_at_10
value: 46.369
- type: mrr_at_100
value: 47.072
- type: mrr_at_1000
value: 47.111999999999995
- type: mrr_at_3
value: 44.268
- type: mrr_at_5
value: 45.389
- type: ndcg_at_1
value: 38.153
- type: ndcg_at_10
value: 45.925
- type: ndcg_at_100
value: 50.394000000000005
- type: ndcg_at_1000
value: 52.37500000000001
- type: ndcg_at_3
value: 41.754000000000005
- type: ndcg_at_5
value: 43.574
- type: precision_at_1
value: 38.153
- type: precision_at_10
value: 8.796
- type: precision_at_100
value: 1.432
- type: precision_at_1000
value: 0.189
- type: precision_at_3
value: 20.318
- type: precision_at_5
value: 14.395
- type: recall_at_1
value: 30.262
- type: recall_at_10
value: 55.72200000000001
- type: recall_at_100
value: 74.97500000000001
- type: recall_at_1000
value: 87.342
- type: recall_at_3
value: 43.129
- type: recall_at_5
value: 48.336
- type: map_at_1
value: 39.951
- type: map_at_10
value: 51.248000000000005
- type: map_at_100
value: 52.188
- type: map_at_1000
value: 52.247
- type: map_at_3
value: 48.211
- type: map_at_5
value: 49.797000000000004
- type: mrr_at_1
value: 45.329
- type: mrr_at_10
value: 54.749
- type: mrr_at_100
value: 55.367999999999995
- type: mrr_at_1000
value: 55.400000000000006
- type: mrr_at_3
value: 52.382
- type: mrr_at_5
value: 53.649
- type: ndcg_at_1
value: 45.329
- type: ndcg_at_10
value: 56.847
- type: ndcg_at_100
value: 60.738
- type: ndcg_at_1000
value: 61.976
- type: ndcg_at_3
value: 51.59
- type: ndcg_at_5
value: 53.915
- type: precision_at_1
value: 45.329
- type: precision_at_10
value: 8.959
- type: precision_at_100
value: 1.187
- type: precision_at_1000
value: 0.134
- type: precision_at_3
value: 22.612
- type: precision_at_5
value: 15.273
- type: recall_at_1
value: 39.951
- type: recall_at_10
value: 70.053
- type: recall_at_100
value: 86.996
- type: recall_at_1000
value: 95.707
- type: recall_at_3
value: 56.032000000000004
- type: recall_at_5
value: 61.629999999999995
- type: map_at_1
value: 25.566
- type: map_at_10
value: 33.207
- type: map_at_100
value: 34.166000000000004
- type: map_at_1000
value: 34.245
- type: map_at_3
value: 30.94
- type: map_at_5
value: 32.01
- type: mrr_at_1
value: 27.345000000000002
- type: mrr_at_10
value: 35.193000000000005
- type: mrr_at_100
value: 35.965
- type: mrr_at_1000
value: 36.028999999999996
- type: mrr_at_3
value: 32.806000000000004
- type: mrr_at_5
value: 34.021
- type: ndcg_at_1
value: 27.345000000000002
- type: ndcg_at_10
value: 37.891999999999996
- type: ndcg_at_100
value: 42.664
- type: ndcg_at_1000
value: 44.757000000000005
- type: ndcg_at_3
value: 33.123000000000005
- type: ndcg_at_5
value: 35.035
- type: precision_at_1
value: 27.345000000000002
- type: precision_at_10
value: 5.763
- type: precision_at_100
value: 0.859
- type: precision_at_1000
value: 0.108
- type: precision_at_3
value: 13.71
- type: precision_at_5
value: 9.401
- type: recall_at_1
value: 25.566
- type: recall_at_10
value: 50.563
- type: recall_at_100
value: 72.86399999999999
- type: recall_at_1000
value: 88.68599999999999
- type: recall_at_3
value: 37.43
- type: recall_at_5
value: 41.894999999999996
- type: map_at_1
value: 16.663
- type: map_at_10
value: 23.552
- type: map_at_100
value: 24.538
- type: map_at_1000
value: 24.661
- type: map_at_3
value: 21.085
- type: map_at_5
value: 22.391
- type: mrr_at_1
value: 20.025000000000002
- type: mrr_at_10
value: 27.643
- type: mrr_at_100
value: 28.499999999999996
- type: mrr_at_1000
value: 28.582
- type: mrr_at_3
value: 25.083
- type: mrr_at_5
value: 26.544
- type: ndcg_at_1
value: 20.025000000000002
- type: ndcg_at_10
value: 28.272000000000002
- type: ndcg_at_100
value: 33.353
- type: ndcg_at_1000
value: 36.454
- type: ndcg_at_3
value: 23.579
- type: ndcg_at_5
value: 25.685000000000002
- type: precision_at_1
value: 20.025000000000002
- type: precision_at_10
value: 5.187
- type: precision_at_100
value: 0.897
- type: precision_at_1000
value: 0.13
- type: precision_at_3
value: 10.987
- type: precision_at_5
value: 8.06
- type: recall_at_1
value: 16.663
- type: recall_at_10
value: 38.808
- type: recall_at_100
value: 61.305
- type: recall_at_1000
value: 83.571
- type: recall_at_3
value: 25.907999999999998
- type: recall_at_5
value: 31.214
- type: map_at_1
value: 27.695999999999998
- type: map_at_10
value: 37.018
- type: map_at_100
value: 38.263000000000005
- type: map_at_1000
value: 38.371
- type: map_at_3
value: 34.226
- type: map_at_5
value: 35.809999999999995
- type: mrr_at_1
value: 32.916000000000004
- type: mrr_at_10
value: 42.067
- type: mrr_at_100
value: 42.925000000000004
- type: mrr_at_1000
value: 42.978
- type: mrr_at_3
value: 39.637
- type: mrr_at_5
value: 41.134
- type: ndcg_at_1
value: 32.916000000000004
- type: ndcg_at_10
value: 42.539
- type: ndcg_at_100
value: 47.873
- type: ndcg_at_1000
value: 50.08200000000001
- type: ndcg_at_3
value: 37.852999999999994
- type: ndcg_at_5
value: 40.201
- type: precision_at_1
value: 32.916000000000004
- type: precision_at_10
value: 7.5840000000000005
- type: precision_at_100
value: 1.199
- type: precision_at_1000
value: 0.155
- type: precision_at_3
value: 17.485
- type: precision_at_5
value: 12.512
- type: recall_at_1
value: 27.695999999999998
- type: recall_at_10
value: 53.638
- type: recall_at_100
value: 76.116
- type: recall_at_1000
value: 91.069
- type: recall_at_3
value: 41.13
- type: recall_at_5
value: 46.872
- type: map_at_1
value: 24.108
- type: map_at_10
value: 33.372
- type: map_at_100
value: 34.656
- type: map_at_1000
value: 34.768
- type: map_at_3
value: 30.830999999999996
- type: map_at_5
value: 32.204
- type: mrr_at_1
value: 29.110000000000003
- type: mrr_at_10
value: 37.979
- type: mrr_at_100
value: 38.933
- type: mrr_at_1000
value: 38.988
- type: mrr_at_3
value: 35.731
- type: mrr_at_5
value: 36.963
- type: ndcg_at_1
value: 29.110000000000003
- type: ndcg_at_10
value: 38.635000000000005
- type: ndcg_at_100
value: 44.324999999999996
- type: ndcg_at_1000
value: 46.747
- type: ndcg_at_3
value: 34.37
- type: ndcg_at_5
value: 36.228
- type: precision_at_1
value: 29.110000000000003
- type: precision_at_10
value: 6.963
- type: precision_at_100
value: 1.146
- type: precision_at_1000
value: 0.152
- type: precision_at_3
value: 16.400000000000002
- type: precision_at_5
value: 11.552999999999999
- type: recall_at_1
value: 24.108
- type: recall_at_10
value: 49.597
- type: recall_at_100
value: 73.88900000000001
- type: recall_at_1000
value: 90.62400000000001
- type: recall_at_3
value: 37.662
- type: recall_at_5
value: 42.565
- type: map_at_1
value: 25.00791666666667
- type: map_at_10
value: 33.287749999999996
- type: map_at_100
value: 34.41141666666667
- type: map_at_1000
value: 34.52583333333333
- type: map_at_3
value: 30.734416666666668
- type: map_at_5
value: 32.137166666666666
- type: mrr_at_1
value: 29.305666666666664
- type: mrr_at_10
value: 37.22966666666666
- type: mrr_at_100
value: 38.066583333333334
- type: mrr_at_1000
value: 38.12616666666667
- type: mrr_at_3
value: 34.92275
- type: mrr_at_5
value: 36.23333333333334
- type: ndcg_at_1
value: 29.305666666666664
- type: ndcg_at_10
value: 38.25533333333333
- type: ndcg_at_100
value: 43.25266666666666
- type: ndcg_at_1000
value: 45.63583333333334
- type: ndcg_at_3
value: 33.777166666666666
- type: ndcg_at_5
value: 35.85
- type: precision_at_1
value: 29.305666666666664
- type: precision_at_10
value: 6.596416666666667
- type: precision_at_100
value: 1.0784166666666668
- type: precision_at_1000
value: 0.14666666666666664
- type: precision_at_3
value: 15.31075
- type: precision_at_5
value: 10.830916666666667
- type: recall_at_1
value: 25.00791666666667
- type: recall_at_10
value: 49.10933333333333
- type: recall_at_100
value: 71.09216666666667
- type: recall_at_1000
value: 87.77725000000001
- type: recall_at_3
value: 36.660916666666665
- type: recall_at_5
value: 41.94149999999999
- type: map_at_1
value: 23.521
- type: map_at_10
value: 30.043
- type: map_at_100
value: 30.936000000000003
- type: map_at_1000
value: 31.022
- type: map_at_3
value: 27.926000000000002
- type: map_at_5
value: 29.076999999999998
- type: mrr_at_1
value: 26.227
- type: mrr_at_10
value: 32.822
- type: mrr_at_100
value: 33.61
- type: mrr_at_1000
value: 33.672000000000004
- type: mrr_at_3
value: 30.776999999999997
- type: mrr_at_5
value: 31.866
- type: ndcg_at_1
value: 26.227
- type: ndcg_at_10
value: 34.041
- type: ndcg_at_100
value: 38.394
- type: ndcg_at_1000
value: 40.732
- type: ndcg_at_3
value: 30.037999999999997
- type: ndcg_at_5
value: 31.845000000000002
- type: precision_at_1
value: 26.227
- type: precision_at_10
value: 5.244999999999999
- type: precision_at_100
value: 0.808
- type: precision_at_1000
value: 0.107
- type: precision_at_3
value: 12.679000000000002
- type: precision_at_5
value: 8.773
- type: recall_at_1
value: 23.521
- type: recall_at_10
value: 43.633
- type: recall_at_100
value: 63.126000000000005
- type: recall_at_1000
value: 80.765
- type: recall_at_3
value: 32.614
- type: recall_at_5
value: 37.15
- type: map_at_1
value: 16.236
- type: map_at_10
value: 22.898
- type: map_at_100
value: 23.878
- type: map_at_1000
value: 24.009
- type: map_at_3
value: 20.87
- type: map_at_5
value: 22.025
- type: mrr_at_1
value: 19.339000000000002
- type: mrr_at_10
value: 26.382
- type: mrr_at_100
value: 27.245
- type: mrr_at_1000
value: 27.33
- type: mrr_at_3
value: 24.386
- type: mrr_at_5
value: 25.496000000000002
- type: ndcg_at_1
value: 19.339000000000002
- type: ndcg_at_10
value: 27.139999999999997
- type: ndcg_at_100
value: 31.944
- type: ndcg_at_1000
value: 35.077999999999996
- type: ndcg_at_3
value: 23.424
- type: ndcg_at_5
value: 25.188
- type: precision_at_1
value: 19.339000000000002
- type: precision_at_10
value: 4.8309999999999995
- type: precision_at_100
value: 0.845
- type: precision_at_1000
value: 0.128
- type: precision_at_3
value: 10.874
- type: precision_at_5
value: 7.825
- type: recall_at_1
value: 16.236
- type: recall_at_10
value: 36.513
- type: recall_at_100
value: 57.999
- type: recall_at_1000
value: 80.512
- type: recall_at_3
value: 26.179999999999996
- type: recall_at_5
value: 30.712
- type: map_at_1
value: 24.11
- type: map_at_10
value: 31.566
- type: map_at_100
value: 32.647
- type: map_at_1000
value: 32.753
- type: map_at_3
value: 29.24
- type: map_at_5
value: 30.564999999999998
- type: mrr_at_1
value: 28.265
- type: mrr_at_10
value: 35.504000000000005
- type: mrr_at_100
value: 36.436
- type: mrr_at_1000
value: 36.503
- type: mrr_at_3
value: 33.349000000000004
- type: mrr_at_5
value: 34.622
- type: ndcg_at_1
value: 28.265
- type: ndcg_at_10
value: 36.192
- type: ndcg_at_100
value: 41.388000000000005
- type: ndcg_at_1000
value: 43.948
- type: ndcg_at_3
value: 31.959
- type: ndcg_at_5
value: 33.998
- type: precision_at_1
value: 28.265
- type: precision_at_10
value: 5.989
- type: precision_at_100
value: 0.9650000000000001
- type: precision_at_1000
value: 0.13
- type: precision_at_3
value: 14.335
- type: precision_at_5
value: 10.112
- type: recall_at_1
value: 24.11
- type: recall_at_10
value: 46.418
- type: recall_at_100
value: 69.314
- type: recall_at_1000
value: 87.397
- type: recall_at_3
value: 34.724
- type: recall_at_5
value: 39.925
- type: map_at_1
value: 22.091
- type: map_at_10
value: 29.948999999999998
- type: map_at_100
value: 31.502000000000002
- type: map_at_1000
value: 31.713
- type: map_at_3
value: 27.464
- type: map_at_5
value: 28.968
- type: mrr_at_1
value: 26.482
- type: mrr_at_10
value: 34.009
- type: mrr_at_100
value: 35.081
- type: mrr_at_1000
value: 35.138000000000005
- type: mrr_at_3
value: 31.785000000000004
- type: mrr_at_5
value: 33.178999999999995
- type: ndcg_at_1
value: 26.482
- type: ndcg_at_10
value: 35.008
- type: ndcg_at_100
value: 41.272999999999996
- type: ndcg_at_1000
value: 43.972
- type: ndcg_at_3
value: 30.804
- type: ndcg_at_5
value: 33.046
- type: precision_at_1
value: 26.482
- type: precision_at_10
value: 6.462
- type: precision_at_100
value: 1.431
- type: precision_at_1000
value: 0.22899999999999998
- type: precision_at_3
value: 14.360999999999999
- type: precision_at_5
value: 10.474
- type: recall_at_1
value: 22.091
- type: recall_at_10
value: 45.125
- type: recall_at_100
value: 72.313
- type: recall_at_1000
value: 89.503
- type: recall_at_3
value: 33.158
- type: recall_at_5
value: 39.086999999999996
- type: map_at_1
value: 19.883
- type: map_at_10
value: 26.951000000000004
- type: map_at_100
value: 27.927999999999997
- type: map_at_1000
value: 28.022000000000002
- type: map_at_3
value: 24.616
- type: map_at_5
value: 25.917
- type: mrr_at_1
value: 21.996
- type: mrr_at_10
value: 29.221000000000004
- type: mrr_at_100
value: 30.024
- type: mrr_at_1000
value: 30.095
- type: mrr_at_3
value: 26.833000000000002
- type: mrr_at_5
value: 28.155
- type: ndcg_at_1
value: 21.996
- type: ndcg_at_10
value: 31.421
- type: ndcg_at_100
value: 36.237
- type: ndcg_at_1000
value: 38.744
- type: ndcg_at_3
value: 26.671
- type: ndcg_at_5
value: 28.907
- type: precision_at_1
value: 21.996
- type: precision_at_10
value: 5.009
- type: precision_at_100
value: 0.799
- type: precision_at_1000
value: 0.11199999999999999
- type: precision_at_3
value: 11.275
- type: precision_at_5
value: 8.059
- type: recall_at_1
value: 19.883
- type: recall_at_10
value: 43.132999999999996
- type: recall_at_100
value: 65.654
- type: recall_at_1000
value: 84.492
- type: recall_at_3
value: 30.209000000000003
- type: recall_at_5
value: 35.616
- task:
type: Retrieval
dataset:
name: MTEB ClimateFEVER
type: climate-fever
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 17.756
- type: map_at_10
value: 30.378
- type: map_at_100
value: 32.537
- type: map_at_1000
value: 32.717
- type: map_at_3
value: 25.599
- type: map_at_5
value: 28.372999999999998
- type: mrr_at_1
value: 41.303
- type: mrr_at_10
value: 53.483999999999995
- type: mrr_at_100
value: 54.106
- type: mrr_at_1000
value: 54.127
- type: mrr_at_3
value: 50.315
- type: mrr_at_5
value: 52.396
- type: ndcg_at_1
value: 41.303
- type: ndcg_at_10
value: 40.503
- type: ndcg_at_100
value: 47.821000000000005
- type: ndcg_at_1000
value: 50.788
- type: ndcg_at_3
value: 34.364
- type: ndcg_at_5
value: 36.818
- type: precision_at_1
value: 41.303
- type: precision_at_10
value: 12.463000000000001
- type: precision_at_100
value: 2.037
- type: precision_at_1000
value: 0.26
- type: precision_at_3
value: 25.798
- type: precision_at_5
value: 19.896
- type: recall_at_1
value: 17.756
- type: recall_at_10
value: 46.102
- type: recall_at_100
value: 70.819
- type: recall_at_1000
value: 87.21799999999999
- type: recall_at_3
value: 30.646
- type: recall_at_5
value: 38.022
- task:
type: Retrieval
dataset:
name: MTEB DBPedia
type: dbpedia-entity
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 9.033
- type: map_at_10
value: 20.584
- type: map_at_100
value: 29.518
- type: map_at_1000
value: 31.186000000000003
- type: map_at_3
value: 14.468
- type: map_at_5
value: 17.177
- type: mrr_at_1
value: 69.75
- type: mrr_at_10
value: 77.025
- type: mrr_at_100
value: 77.36699999999999
- type: mrr_at_1000
value: 77.373
- type: mrr_at_3
value: 75.583
- type: mrr_at_5
value: 76.396
- type: ndcg_at_1
value: 58.5
- type: ndcg_at_10
value: 45.033
- type: ndcg_at_100
value: 49.071
- type: ndcg_at_1000
value: 56.056
- type: ndcg_at_3
value: 49.936
- type: ndcg_at_5
value: 47.471999999999994
- type: precision_at_1
value: 69.75
- type: precision_at_10
value: 35.775
- type: precision_at_100
value: 11.594999999999999
- type: precision_at_1000
value: 2.062
- type: precision_at_3
value: 52.5
- type: precision_at_5
value: 45.300000000000004
- type: recall_at_1
value: 9.033
- type: recall_at_10
value: 26.596999999999998
- type: recall_at_100
value: 54.607000000000006
- type: recall_at_1000
value: 76.961
- type: recall_at_3
value: 15.754999999999999
- type: recall_at_5
value: 20.033
- task:
type: Classification
dataset:
name: MTEB EmotionClassification
type: mteb/emotion
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 48.345000000000006
- type: f1
value: 43.4514918068706
- task:
type: Retrieval
dataset:
name: MTEB FEVER
type: fever
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 71.29100000000001
- type: map_at_10
value: 81.059
- type: map_at_100
value: 81.341
- type: map_at_1000
value: 81.355
- type: map_at_3
value: 79.74799999999999
- type: map_at_5
value: 80.612
- type: mrr_at_1
value: 76.40299999999999
- type: mrr_at_10
value: 84.615
- type: mrr_at_100
value: 84.745
- type: mrr_at_1000
value: 84.748
- type: mrr_at_3
value: 83.776
- type: mrr_at_5
value: 84.343
- type: ndcg_at_1
value: 76.40299999999999
- type: ndcg_at_10
value: 84.981
- type: ndcg_at_100
value: 86.00999999999999
- type: ndcg_at_1000
value: 86.252
- type: ndcg_at_3
value: 82.97
- type: ndcg_at_5
value: 84.152
- type: precision_at_1
value: 76.40299999999999
- type: precision_at_10
value: 10.446
- type: precision_at_100
value: 1.1199999999999999
- type: precision_at_1000
value: 0.116
- type: precision_at_3
value: 32.147999999999996
- type: precision_at_5
value: 20.135
- type: recall_at_1
value: 71.29100000000001
- type: recall_at_10
value: 93.232
- type: recall_at_100
value: 97.363
- type: recall_at_1000
value: 98.905
- type: recall_at_3
value: 87.893
- type: recall_at_5
value: 90.804
- task:
type: Retrieval
dataset:
name: MTEB FiQA2018
type: fiqa
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 18.667
- type: map_at_10
value: 30.853
- type: map_at_100
value: 32.494
- type: map_at_1000
value: 32.677
- type: map_at_3
value: 26.91
- type: map_at_5
value: 29.099000000000004
- type: mrr_at_1
value: 37.191
- type: mrr_at_10
value: 46.171
- type: mrr_at_100
value: 47.056
- type: mrr_at_1000
value: 47.099000000000004
- type: mrr_at_3
value: 44.059
- type: mrr_at_5
value: 45.147
- type: ndcg_at_1
value: 37.191
- type: ndcg_at_10
value: 38.437
- type: ndcg_at_100
value: 44.62
- type: ndcg_at_1000
value: 47.795
- type: ndcg_at_3
value: 35.003
- type: ndcg_at_5
value: 36.006
- type: precision_at_1
value: 37.191
- type: precision_at_10
value: 10.586
- type: precision_at_100
value: 1.688
- type: precision_at_1000
value: 0.22699999999999998
- type: precision_at_3
value: 23.302
- type: precision_at_5
value: 17.006
- type: recall_at_1
value: 18.667
- type: recall_at_10
value: 45.367000000000004
- type: recall_at_100
value: 68.207
- type: recall_at_1000
value: 87.072
- type: recall_at_3
value: 32.129000000000005
- type: recall_at_5
value: 37.719
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA
type: hotpotqa
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 39.494
- type: map_at_10
value: 66.223
- type: map_at_100
value: 67.062
- type: map_at_1000
value: 67.11500000000001
- type: map_at_3
value: 62.867
- type: map_at_5
value: 64.994
- type: mrr_at_1
value: 78.987
- type: mrr_at_10
value: 84.585
- type: mrr_at_100
value: 84.773
- type: mrr_at_1000
value: 84.77900000000001
- type: mrr_at_3
value: 83.592
- type: mrr_at_5
value: 84.235
- type: ndcg_at_1
value: 78.987
- type: ndcg_at_10
value: 73.64
- type: ndcg_at_100
value: 76.519
- type: ndcg_at_1000
value: 77.51
- type: ndcg_at_3
value: 68.893
- type: ndcg_at_5
value: 71.585
- type: precision_at_1
value: 78.987
- type: precision_at_10
value: 15.529000000000002
- type: precision_at_100
value: 1.7770000000000001
- type: precision_at_1000
value: 0.191
- type: precision_at_3
value: 44.808
- type: precision_at_5
value: 29.006999999999998
- type: recall_at_1
value: 39.494
- type: recall_at_10
value: 77.643
- type: recall_at_100
value: 88.825
- type: recall_at_1000
value: 95.321
- type: recall_at_3
value: 67.211
- type: recall_at_5
value: 72.519
- task:
type: Classification
dataset:
name: MTEB ImdbClassification
type: mteb/imdb
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 85.55959999999999
- type: ap
value: 80.7246500384617
- type: f1
value: 85.52336485065454
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO
type: msmarco
config: default
split: dev
revision: None
metrics:
- type: map_at_1
value: 23.631
- type: map_at_10
value: 36.264
- type: map_at_100
value: 37.428
- type: map_at_1000
value: 37.472
- type: map_at_3
value: 32.537
- type: map_at_5
value: 34.746
- type: mrr_at_1
value: 24.312
- type: mrr_at_10
value: 36.858000000000004
- type: mrr_at_100
value: 37.966
- type: mrr_at_1000
value: 38.004
- type: mrr_at_3
value: 33.188
- type: mrr_at_5
value: 35.367
- type: ndcg_at_1
value: 24.312
- type: ndcg_at_10
value: 43.126999999999995
- type: ndcg_at_100
value: 48.642
- type: ndcg_at_1000
value: 49.741
- type: ndcg_at_3
value: 35.589
- type: ndcg_at_5
value: 39.515
- type: precision_at_1
value: 24.312
- type: precision_at_10
value: 6.699
- type: precision_at_100
value: 0.9450000000000001
- type: precision_at_1000
value: 0.104
- type: precision_at_3
value: 15.153
- type: precision_at_5
value: 11.065999999999999
- type: recall_at_1
value: 23.631
- type: recall_at_10
value: 64.145
- type: recall_at_100
value: 89.41
- type: recall_at_1000
value: 97.83500000000001
- type: recall_at_3
value: 43.769000000000005
- type: recall_at_5
value: 53.169
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (en)
type: mteb/mtop_domain
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 93.4108527131783
- type: f1
value: 93.1415880261038
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (en)
type: mteb/mtop_intent
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 77.24806201550388
- type: f1
value: 60.531916308197175
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (en)
type: mteb/amazon_massive_intent
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 73.71553463349024
- type: f1
value: 71.70753174900791
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (en)
type: mteb/amazon_massive_scenario
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 77.79757901815736
- type: f1
value: 77.83719850433258
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringP2P
type: mteb/medrxiv-clustering-p2p
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 33.74193296622113
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringS2S
type: mteb/medrxiv-clustering-s2s
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 30.64257594108566
- task:
type: Reranking
dataset:
name: MTEB MindSmallReranking
type: mteb/mind_small
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 30.811018518883625
- type: mrr
value: 31.910376577445003
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus
type: nfcorpus
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 5.409
- type: map_at_10
value: 13.093
- type: map_at_100
value: 16.256999999999998
- type: map_at_1000
value: 17.617
- type: map_at_3
value: 9.555
- type: map_at_5
value: 11.428
- type: mrr_at_1
value: 45.201
- type: mrr_at_10
value: 54.179
- type: mrr_at_100
value: 54.812000000000005
- type: mrr_at_1000
value: 54.840999999999994
- type: mrr_at_3
value: 51.909000000000006
- type: mrr_at_5
value: 53.519000000000005
- type: ndcg_at_1
value: 43.189
- type: ndcg_at_10
value: 35.028
- type: ndcg_at_100
value: 31.226
- type: ndcg_at_1000
value: 39.678000000000004
- type: ndcg_at_3
value: 40.596
- type: ndcg_at_5
value: 38.75
- type: precision_at_1
value: 44.582
- type: precision_at_10
value: 25.974999999999998
- type: precision_at_100
value: 7.793
- type: precision_at_1000
value: 2.036
- type: precision_at_3
value: 38.493
- type: precision_at_5
value: 33.994
- type: recall_at_1
value: 5.409
- type: recall_at_10
value: 16.875999999999998
- type: recall_at_100
value: 30.316
- type: recall_at_1000
value: 60.891
- type: recall_at_3
value: 10.688
- type: recall_at_5
value: 13.832
- task:
type: Retrieval
dataset:
name: MTEB NQ
type: nq
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 36.375
- type: map_at_10
value: 51.991
- type: map_at_100
value: 52.91400000000001
- type: map_at_1000
value: 52.93600000000001
- type: map_at_3
value: 48.014
- type: map_at_5
value: 50.381
- type: mrr_at_1
value: 40.759
- type: mrr_at_10
value: 54.617000000000004
- type: mrr_at_100
value: 55.301
- type: mrr_at_1000
value: 55.315000000000005
- type: mrr_at_3
value: 51.516
- type: mrr_at_5
value: 53.435
- type: ndcg_at_1
value: 40.759
- type: ndcg_at_10
value: 59.384
- type: ndcg_at_100
value: 63.157
- type: ndcg_at_1000
value: 63.654999999999994
- type: ndcg_at_3
value: 52.114000000000004
- type: ndcg_at_5
value: 55.986000000000004
- type: precision_at_1
value: 40.759
- type: precision_at_10
value: 9.411999999999999
- type: precision_at_100
value: 1.153
- type: precision_at_1000
value: 0.12
- type: precision_at_3
value: 23.329
- type: precision_at_5
value: 16.256999999999998
- type: recall_at_1
value: 36.375
- type: recall_at_10
value: 79.053
- type: recall_at_100
value: 95.167
- type: recall_at_1000
value: 98.82
- type: recall_at_3
value: 60.475
- type: recall_at_5
value: 69.327
- task:
type: Retrieval
dataset:
name: MTEB QuoraRetrieval
type: quora
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 70.256
- type: map_at_10
value: 83.8
- type: map_at_100
value: 84.425
- type: map_at_1000
value: 84.444
- type: map_at_3
value: 80.906
- type: map_at_5
value: 82.717
- type: mrr_at_1
value: 80.97999999999999
- type: mrr_at_10
value: 87.161
- type: mrr_at_100
value: 87.262
- type: mrr_at_1000
value: 87.263
- type: mrr_at_3
value: 86.175
- type: mrr_at_5
value: 86.848
- type: ndcg_at_1
value: 80.97999999999999
- type: ndcg_at_10
value: 87.697
- type: ndcg_at_100
value: 88.959
- type: ndcg_at_1000
value: 89.09899999999999
- type: ndcg_at_3
value: 84.83800000000001
- type: ndcg_at_5
value: 86.401
- type: precision_at_1
value: 80.97999999999999
- type: precision_at_10
value: 13.261000000000001
- type: precision_at_100
value: 1.5150000000000001
- type: precision_at_1000
value: 0.156
- type: precision_at_3
value: 37.01
- type: precision_at_5
value: 24.298000000000002
- type: recall_at_1
value: 70.256
- type: recall_at_10
value: 94.935
- type: recall_at_100
value: 99.274
- type: recall_at_1000
value: 99.928
- type: recall_at_3
value: 86.602
- type: recall_at_5
value: 91.133
- task:
type: Clustering
dataset:
name: MTEB RedditClustering
type: mteb/reddit-clustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 56.322692497613104
- task:
type: Clustering
dataset:
name: MTEB RedditClusteringP2P
type: mteb/reddit-clustering-p2p
config: default
split: test
revision: 282350215ef01743dc01b456c7f5241fa8937f16
metrics:
- type: v_measure
value: 61.895813503775074
- task:
type: Retrieval
dataset:
name: MTEB SCIDOCS
type: scidocs
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 4.338
- type: map_at_10
value: 10.767
- type: map_at_100
value: 12.537999999999998
- type: map_at_1000
value: 12.803999999999998
- type: map_at_3
value: 7.788
- type: map_at_5
value: 9.302000000000001
- type: mrr_at_1
value: 21.4
- type: mrr_at_10
value: 31.637999999999998
- type: mrr_at_100
value: 32.688
- type: mrr_at_1000
value: 32.756
- type: mrr_at_3
value: 28.433000000000003
- type: mrr_at_5
value: 30.178
- type: ndcg_at_1
value: 21.4
- type: ndcg_at_10
value: 18.293
- type: ndcg_at_100
value: 25.274
- type: ndcg_at_1000
value: 30.284
- type: ndcg_at_3
value: 17.391000000000002
- type: ndcg_at_5
value: 15.146999999999998
- type: precision_at_1
value: 21.4
- type: precision_at_10
value: 9.48
- type: precision_at_100
value: 1.949
- type: precision_at_1000
value: 0.316
- type: precision_at_3
value: 16.167
- type: precision_at_5
value: 13.22
- type: recall_at_1
value: 4.338
- type: recall_at_10
value: 19.213
- type: recall_at_100
value: 39.562999999999995
- type: recall_at_1000
value: 64.08
- type: recall_at_3
value: 9.828000000000001
- type: recall_at_5
value: 13.383000000000001
- task:
type: STS
dataset:
name: MTEB SICK-R
type: mteb/sickr-sts
config: default
split: test
revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
metrics:
- type: cos_sim_pearson
value: 82.42568163642142
- type: cos_sim_spearman
value: 78.5797159641342
- type: euclidean_pearson
value: 80.22151260811604
- type: euclidean_spearman
value: 78.5797151953878
- type: manhattan_pearson
value: 80.21224215864788
- type: manhattan_spearman
value: 78.55641478381344
- task:
type: STS
dataset:
name: MTEB STS12
type: mteb/sts12-sts
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_pearson
value: 85.44020710812569
- type: cos_sim_spearman
value: 78.91631735081286
- type: euclidean_pearson
value: 81.64188964182102
- type: euclidean_spearman
value: 78.91633286881678
- type: manhattan_pearson
value: 81.69294748512496
- type: manhattan_spearman
value: 78.93438558002656
- task:
type: STS
dataset:
name: MTEB STS13
type: mteb/sts13-sts
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_pearson
value: 84.27165426412311
- type: cos_sim_spearman
value: 85.40429140249618
- type: euclidean_pearson
value: 84.7509580724893
- type: euclidean_spearman
value: 85.40429140249618
- type: manhattan_pearson
value: 84.76488289321308
- type: manhattan_spearman
value: 85.4256793698708
- task:
type: STS
dataset:
name: MTEB STS14
type: mteb/sts14-sts
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_pearson
value: 83.138851760732
- type: cos_sim_spearman
value: 81.64101363896586
- type: euclidean_pearson
value: 82.55165038934942
- type: euclidean_spearman
value: 81.64105257080502
- type: manhattan_pearson
value: 82.52802949883335
- type: manhattan_spearman
value: 81.61255430718158
- task:
type: STS
dataset:
name: MTEB STS15
type: mteb/sts15-sts
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_pearson
value: 86.0654695484029
- type: cos_sim_spearman
value: 87.20408521902229
- type: euclidean_pearson
value: 86.8110651362115
- type: euclidean_spearman
value: 87.20408521902229
- type: manhattan_pearson
value: 86.77984656478691
- type: manhattan_spearman
value: 87.1719947099227
- task:
type: STS
dataset:
name: MTEB STS16
type: mteb/sts16-sts
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_pearson
value: 83.77823915496512
- type: cos_sim_spearman
value: 85.43566325729779
- type: euclidean_pearson
value: 84.5396956658821
- type: euclidean_spearman
value: 85.43566325729779
- type: manhattan_pearson
value: 84.5665398848169
- type: manhattan_spearman
value: 85.44375870303232
- task:
type: STS
dataset:
name: MTEB STS17 (en-en)
type: mteb/sts17-crosslingual-sts
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 87.20030208471798
- type: cos_sim_spearman
value: 87.20485505076539
- type: euclidean_pearson
value: 88.10588324368722
- type: euclidean_spearman
value: 87.20485505076539
- type: manhattan_pearson
value: 87.92324770415183
- type: manhattan_spearman
value: 87.0571314561877
- task:
type: STS
dataset:
name: MTEB STS22 (en)
type: mteb/sts22-crosslingual-sts
config: en
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 63.06093161604453
- type: cos_sim_spearman
value: 64.2163140357722
- type: euclidean_pearson
value: 65.27589680994006
- type: euclidean_spearman
value: 64.2163140357722
- type: manhattan_pearson
value: 65.45904383711101
- type: manhattan_spearman
value: 64.55404716679305
- task:
type: STS
dataset:
name: MTEB STSBenchmark
type: mteb/stsbenchmark-sts
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_pearson
value: 84.32976164578706
- type: cos_sim_spearman
value: 85.54302197678368
- type: euclidean_pearson
value: 85.26307149193056
- type: euclidean_spearman
value: 85.54302197678368
- type: manhattan_pearson
value: 85.26647282029371
- type: manhattan_spearman
value: 85.5316135265568
- task:
type: Reranking
dataset:
name: MTEB SciDocsRR
type: mteb/scidocs-reranking
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 81.44675968318754
- type: mrr
value: 94.92741826075158
- task:
type: Retrieval
dataset:
name: MTEB SciFact
type: scifact
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 56.34400000000001
- type: map_at_10
value: 65.927
- type: map_at_100
value: 66.431
- type: map_at_1000
value: 66.461
- type: map_at_3
value: 63.529
- type: map_at_5
value: 64.818
- type: mrr_at_1
value: 59.333000000000006
- type: mrr_at_10
value: 67.54599999999999
- type: mrr_at_100
value: 67.892
- type: mrr_at_1000
value: 67.917
- type: mrr_at_3
value: 65.778
- type: mrr_at_5
value: 66.794
- type: ndcg_at_1
value: 59.333000000000006
- type: ndcg_at_10
value: 70.5
- type: ndcg_at_100
value: 72.688
- type: ndcg_at_1000
value: 73.483
- type: ndcg_at_3
value: 66.338
- type: ndcg_at_5
value: 68.265
- type: precision_at_1
value: 59.333000000000006
- type: precision_at_10
value: 9.3
- type: precision_at_100
value: 1.053
- type: precision_at_1000
value: 0.11199999999999999
- type: precision_at_3
value: 25.889
- type: precision_at_5
value: 16.866999999999997
- type: recall_at_1
value: 56.34400000000001
- type: recall_at_10
value: 82.789
- type: recall_at_100
value: 92.767
- type: recall_at_1000
value: 99
- type: recall_at_3
value: 71.64399999999999
- type: recall_at_5
value: 76.322
- task:
type: PairClassification
dataset:
name: MTEB SprintDuplicateQuestions
type: mteb/sprintduplicatequestions-pairclassification
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.75742574257426
- type: cos_sim_ap
value: 93.52081548447406
- type: cos_sim_f1
value: 87.33850129198966
- type: cos_sim_precision
value: 90.37433155080214
- type: cos_sim_recall
value: 84.5
- type: dot_accuracy
value: 99.75742574257426
- type: dot_ap
value: 93.52081548447406
- type: dot_f1
value: 87.33850129198966
- type: dot_precision
value: 90.37433155080214
- type: dot_recall
value: 84.5
- type: euclidean_accuracy
value: 99.75742574257426
- type: euclidean_ap
value: 93.52081548447406
- type: euclidean_f1
value: 87.33850129198966
- type: euclidean_precision
value: 90.37433155080214
- type: euclidean_recall
value: 84.5
- type: manhattan_accuracy
value: 99.75841584158415
- type: manhattan_ap
value: 93.4975678585854
- type: manhattan_f1
value: 87.26708074534162
- type: manhattan_precision
value: 90.45064377682404
- type: manhattan_recall
value: 84.3
- type: max_accuracy
value: 99.75841584158415
- type: max_ap
value: 93.52081548447406
- type: max_f1
value: 87.33850129198966
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClustering
type: mteb/stackexchange-clustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 64.31437036686651
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClusteringP2P
type: mteb/stackexchange-clustering-p2p
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 33.25569319007206
- task:
type: Reranking
dataset:
name: MTEB StackOverflowDupQuestions
type: mteb/stackoverflowdupquestions-reranking
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 49.90474939720706
- type: mrr
value: 50.568115503777264
- task:
type: Summarization
dataset:
name: MTEB SummEval
type: mteb/summeval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 29.866828641244712
- type: cos_sim_spearman
value: 30.077555055873866
- type: dot_pearson
value: 29.866832988572266
- type: dot_spearman
value: 30.077555055873866
- task:
type: Retrieval
dataset:
name: MTEB TRECCOVID
type: trec-covid
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 0.232
- type: map_at_10
value: 2.094
- type: map_at_100
value: 11.971
- type: map_at_1000
value: 28.158
- type: map_at_3
value: 0.688
- type: map_at_5
value: 1.114
- type: mrr_at_1
value: 88
- type: mrr_at_10
value: 93.4
- type: mrr_at_100
value: 93.4
- type: mrr_at_1000
value: 93.4
- type: mrr_at_3
value: 93
- type: mrr_at_5
value: 93.4
- type: ndcg_at_1
value: 84
- type: ndcg_at_10
value: 79.923
- type: ndcg_at_100
value: 61.17
- type: ndcg_at_1000
value: 53.03
- type: ndcg_at_3
value: 84.592
- type: ndcg_at_5
value: 82.821
- type: precision_at_1
value: 88
- type: precision_at_10
value: 85
- type: precision_at_100
value: 63.019999999999996
- type: precision_at_1000
value: 23.554
- type: precision_at_3
value: 89.333
- type: precision_at_5
value: 87.2
- type: recall_at_1
value: 0.232
- type: recall_at_10
value: 2.255
- type: recall_at_100
value: 14.823
- type: recall_at_1000
value: 49.456
- type: recall_at_3
value: 0.718
- type: recall_at_5
value: 1.175
- task:
type: Retrieval
dataset:
name: MTEB Touche2020
type: webis-touche2020
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 2.547
- type: map_at_10
value: 11.375
- type: map_at_100
value: 18.194
- type: map_at_1000
value: 19.749
- type: map_at_3
value: 5.825
- type: map_at_5
value: 8.581
- type: mrr_at_1
value: 32.653
- type: mrr_at_10
value: 51.32
- type: mrr_at_100
value: 51.747
- type: mrr_at_1000
value: 51.747
- type: mrr_at_3
value: 47.278999999999996
- type: mrr_at_5
value: 48.605
- type: ndcg_at_1
value: 29.592000000000002
- type: ndcg_at_10
value: 28.151
- type: ndcg_at_100
value: 39.438
- type: ndcg_at_1000
value: 50.769
- type: ndcg_at_3
value: 30.758999999999997
- type: ndcg_at_5
value: 30.366
- type: precision_at_1
value: 32.653
- type: precision_at_10
value: 25.714
- type: precision_at_100
value: 8.041
- type: precision_at_1000
value: 1.555
- type: precision_at_3
value: 33.333
- type: precision_at_5
value: 31.837
- type: recall_at_1
value: 2.547
- type: recall_at_10
value: 18.19
- type: recall_at_100
value: 49.538
- type: recall_at_1000
value: 83.86
- type: recall_at_3
value: 7.329
- type: recall_at_5
value: 11.532
- task:
type: Classification
dataset:
name: MTEB ToxicConversationsClassification
type: mteb/toxic_conversations_50k
config: default
split: test
revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
metrics:
- type: accuracy
value: 71.4952
- type: ap
value: 14.793362635531409
- type: f1
value: 55.204635551516915
- task:
type: Classification
dataset:
name: MTEB TweetSentimentExtractionClassification
type: mteb/tweet_sentiment_extraction
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 61.5365025466893
- type: f1
value: 61.81742556334845
- task:
type: Clustering
dataset:
name: MTEB TwentyNewsgroupsClustering
type: mteb/twentynewsgroups-clustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 49.05531070301185
- task:
type: PairClassification
dataset:
name: MTEB TwitterSemEval2015
type: mteb/twittersemeval2015-pairclassification
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 86.51725576682364
- type: cos_sim_ap
value: 75.2292304265163
- type: cos_sim_f1
value: 69.54022988505749
- type: cos_sim_precision
value: 63.65629110039457
- type: cos_sim_recall
value: 76.62269129287598
- type: dot_accuracy
value: 86.51725576682364
- type: dot_ap
value: 75.22922386081054
- type: dot_f1
value: 69.54022988505749
- type: dot_precision
value: 63.65629110039457
- type: dot_recall
value: 76.62269129287598
- type: euclidean_accuracy
value: 86.51725576682364
- type: euclidean_ap
value: 75.22925730473472
- type: euclidean_f1
value: 69.54022988505749
- type: euclidean_precision
value: 63.65629110039457
- type: euclidean_recall
value: 76.62269129287598
- type: manhattan_accuracy
value: 86.52321630804077
- type: manhattan_ap
value: 75.20608115037336
- type: manhattan_f1
value: 69.60000000000001
- type: manhattan_precision
value: 64.37219730941705
- type: manhattan_recall
value: 75.75197889182058
- type: max_accuracy
value: 86.52321630804077
- type: max_ap
value: 75.22925730473472
- type: max_f1
value: 69.60000000000001
- task:
type: PairClassification
dataset:
name: MTEB TwitterURLCorpus
type: mteb/twitterurlcorpus-pairclassification
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 89.34877944657896
- type: cos_sim_ap
value: 86.71257569277373
- type: cos_sim_f1
value: 79.10386355986088
- type: cos_sim_precision
value: 76.91468470434214
- type: cos_sim_recall
value: 81.4213119802895
- type: dot_accuracy
value: 89.34877944657896
- type: dot_ap
value: 86.71257133133368
- type: dot_f1
value: 79.10386355986088
- type: dot_precision
value: 76.91468470434214
- type: dot_recall
value: 81.4213119802895
- type: euclidean_accuracy
value: 89.34877944657896
- type: euclidean_ap
value: 86.71257651501476
- type: euclidean_f1
value: 79.10386355986088
- type: euclidean_precision
value: 76.91468470434214
- type: euclidean_recall
value: 81.4213119802895
- type: manhattan_accuracy
value: 89.35848177901967
- type: manhattan_ap
value: 86.69330615469126
- type: manhattan_f1
value: 79.13867741453949
- type: manhattan_precision
value: 76.78881807647741
- type: manhattan_recall
value: 81.63689559593472
- type: max_accuracy
value: 89.35848177901967
- type: max_ap
value: 86.71257651501476
- type: max_f1
value: 79.13867741453949
---
# nomic-embed-text-v1: A Reproducible Long Context (8192) Text Embedder
`nomic-embed-text-v1` is an 8192-context-length text encoder that surpasses OpenAI text-embedding-ada-002 and text-embedding-3-small on both short- and long-context tasks.
| Name | SeqLen | MTEB | LoCo | Jina Long Context | Open Weights | Open Training Code | Open Data |
| :-------------------------------:| :----- | :-------- | :------: | :---------------: | :-----------: | :----------------: | :---------- |
| nomic-embed-text-v1 | 8192 | **62.39** |**85.53** | 54.16 | ✅ | ✅ | ✅ |
| jina-embeddings-v2-base-en | 8192 | 60.39 | 85.45 | 51.90 | ✅ | ❌ | ❌ |
| text-embedding-3-small | 8191 | 62.26 | 82.40 | **58.20** | ❌ | ❌ | ❌ |
| text-embedding-ada-002 | 8191 | 60.99 | 52.7 | 55.25 | ❌ | ❌ | ❌ |
## Hosted Inference API
The easiest way to get started with Nomic Embed is through the Nomic Embedding API.
Generating embeddings with the `nomic` Python client is as easy as:
```python
from nomic import embed
output = embed.text(
texts=['Nomic Embedding API', '#keepAIOpen'],
model='nomic-embed-text-v1',
task_type='search_document'
)
print(output)
```
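At search time, queries use the same call; the only change is the task type. A minimal sketch, assuming the same client API as above:
```python
from nomic import embed

# Same embed.text call as above; 'search_query' is the matching task type for queries
query_output = embed.text(
    texts=['who is the first president of the united states?'],
    model='nomic-embed-text-v1',
    task_type='search_query'
)
print(query_output)
```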
For more information, see the [API reference](https://docs.nomic.ai/reference/endpoints/nomic-embed-text).
## Data Visualization
Click the Nomic Atlas map below to visualize a 5M sample of our contrastive pretraining data!
[](https://atlas.nomic.ai/map/nomic-text-embed-v1-5m-sample)
## Training Details
We train our embedder using a multi-stage training pipeline. Starting from a long-context [BERT model](https://huggingface.co/nomic-ai/nomic-bert-2048),
the first unsupervised contrastive stage trains on a dataset generated from weakly related text pairs, such as question-answer pairs from forums like StackExchange and Quora, title-body pairs from Amazon reviews, and summarizations from news articles.
In the second finetuning stage, higher-quality labeled datasets, such as search queries and answers from web searches, are leveraged. Data curation and hard-example mining are crucial in this stage.
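For intuition, both stages optimize a contrastive objective over paired texts with in-batch negatives. The sketch below is a generic InfoNCE-style illustration with an assumed temperature, not the exact training loss; see the technical report for the precise setup.
```python
import torch
import torch.nn.functional as F

def infonce_loss(query_emb, doc_emb, temperature=0.05):
    # Generic InfoNCE with in-batch negatives (illustrative only; temperature is assumed)
    q = F.normalize(query_emb, dim=-1)
    d = F.normalize(doc_emb, dim=-1)
    logits = q @ d.T / temperature    # pairwise cosine similarities
    labels = torch.arange(q.size(0))  # the i-th query matches the i-th document
    return F.cross_entropy(logits, labels)
```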
For more details, see the Nomic Embed [Technical Report](https://static.nomic.ai/reports/2024_Nomic_Embed_Text_Technical_Report.pdf) and corresponding [blog post](https://blog.nomic.ai/posts/nomic-embed-text-v1).
The training data is released in its entirety. For more details, see the `contrastors` [repository](https://github.com/nomic-ai/contrastors).
## Usage
Note that `nomic-embed-text` *requires* prefixes! We support the prefixes `search_query`, `search_document`, `classification`, and `clustering`.
For retrieval applications, prepend `search_document` to all of your documents and `search_query` to your queries.
For example, if you are building a RAG application on top of Wikipedia, you would embed all Wikipedia articles with the prefix `search_document`
and any questions you ask with `search_query`:
```python
queries = ["search_query: who is the first president of the united states?", "search_query: when was babe ruth born?"]
documents = ["search_document: <article about US Presidents>", "search_document: <article about Babe Ruth>"]
```
### Sentence Transformers
```python
from sentence_transformers import SentenceTransformer
model = SentenceTransformer("nomic-ai/nomic-embed-text-v1", trust_remote_code=True)
sentences = ['search_query: What is TSNE?', 'search_query: Who is Laurens van der Maaten?']
embeddings = model.encode(sentences)
print(embeddings)
```
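To use the embeddings for retrieval, score documents against a query with cosine similarity. A minimal sketch using the `sentence_transformers` utility (the example documents are hypothetical):
```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("nomic-ai/nomic-embed-text-v1", trust_remote_code=True)

# Prefixes as described above: search_query for the query, search_document for documents
query_embedding = model.encode("search_query: What is TSNE?")
doc_embeddings = model.encode([
    "search_document: t-SNE is a technique for dimensionality reduction.",
    "search_document: Babe Ruth was an American baseball player.",
])

scores = util.cos_sim(query_embedding, doc_embeddings)
print(scores)  # higher score = more relevant document
```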
### Transformers
```python
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

def mean_pooling(model_output, attention_mask):
    # Average the token embeddings, masking out padding tokens
    token_embeddings = model_output[0]
    input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
    return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)

sentences = ['search_query: What is TSNE?', 'search_query: Who is Laurens van der Maaten?']

tokenizer = AutoTokenizer.from_pretrained('bert-base-uncased')
model = AutoModel.from_pretrained('nomic-ai/nomic-embed-text-v1', trust_remote_code=True)
model.eval()

encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')

with torch.no_grad():
    model_output = model(**encoded_input)

embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
embeddings = F.normalize(embeddings, p=2, dim=1)  # L2-normalize so dot products equal cosine similarity
print(embeddings)
```
The model natively supports scaling the sequence length past 2048 tokens. To do so, apply the following changes:
```diff
- tokenizer = AutoTokenizer.from_pretrained('bert-base-uncased')
+ tokenizer = AutoTokenizer.from_pretrained('bert-base-uncased', model_max_length=8192)
- model = AutoModel.from_pretrained('nomic-ai/nomic-embed-text-v1', trust_remote_code=True)
+ model = AutoModel.from_pretrained('nomic-ai/nomic-embed-text-v1', trust_remote_code=True, rotary_scaling_factor=2)
```
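Putting the diff together, a minimal sketch of the 8192-token setup (built from the two changed lines above):
```python
import torch
from transformers import AutoTokenizer, AutoModel

# Long-context setup from the diff above: 8192-token tokenizer + rotary scaling
tokenizer = AutoTokenizer.from_pretrained('bert-base-uncased', model_max_length=8192)
model = AutoModel.from_pretrained('nomic-ai/nomic-embed-text-v1', trust_remote_code=True, rotary_scaling_factor=2)
model.eval()

encoded_input = tokenizer(['search_document: <a very long document>'], padding=True, truncation=True, return_tensors='pt')
with torch.no_grad():
    model_output = model(**encoded_input)
```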
### Transformers.js
```js
import { pipeline } from '@xenova/transformers';
// Create a feature extraction pipeline
const extractor = await pipeline('feature-extraction', 'nomic-ai/nomic-embed-text-v1', {
quantized: false, // Comment out this line to use the quantized version
});
// Compute sentence embeddings
const texts = ['search_query: What is TSNE?', 'search_query: Who is Laurens van der Maaten?'];
const embeddings = await extractor(texts, { pooling: 'mean', normalize: true });
console.log(embeddings);
```
# Join the Nomic Community
- Nomic: [https://nomic.ai](https://nomic.ai)
- Discord: [https://discord.gg/myY5YDR8z8](https://discord.gg/myY5YDR8z8)
- Twitter: [https://twitter.com/nomic_ai](https://twitter.com/nomic_ai)
# Citation
If you find the model, dataset, or training code useful, please cite our work:
```bibtex
@misc{nussbaum2024nomic,
title={Nomic Embed: Training a Reproducible Long Context Text Embedder},
author={Zach Nussbaum and John X. Morris and Brandon Duderstadt and Andriy Mulyar},
year={2024},
eprint={2402.01613},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
"SUMMARIZATION"
]
| [
"BIOSSES",
"SCIFACT"
]
| Non_BioNLP |
# nomic-embed-text-v1: A Reproducible Long Context (8192) Text Embedder
`nomic-embed-text-v1` is 8192 context length text encoder that surpasses OpenAI text-embedding-ada-002 and text-embedding-3-small performance on short and long context tasks.
| Name | SeqLen | MTEB | LoCo | Jina Long Context | Open Weights | Open Training Code | Open Data |
| :-------------------------------:| :----- | :-------- | :------: | :---------------: | :-----------: | :----------------: | :---------- |
| nomic-embed-text-v1 | 8192 | **62.39** |**85.53** | 54.16 | ✅ | ✅ | ✅ |
| jina-embeddings-v2-base-en | 8192 | 60.39 | 85.45 | 51.90 | ✅ | ❌ | ❌ |
| text-embedding-3-small | 8191 | 62.26 | 82.40 | **58.20** | ❌ | ❌ | ❌ |
| text-embedding-ada-002 | 8191 | 60.99 | 52.7 | 55.25 | ❌ | ❌ | ❌ |
## Hosted Inference API
The easiest way to get started with Nomic Embed is through the Nomic Embedding API.
Generating embeddings with the `nomic` Python client is as easy as
```python
from nomic import embed
output = embed.text(
texts=['Nomic Embedding API', '#keepAIOpen'],
model='nomic-embed-text-v1',
task_type='search_document'
)
print(output)
```
For more information, see the [API reference](https://docs.nomic.ai/reference/endpoints/nomic-embed-text)
## Data Visualization
Click the Nomic Atlas map below to visualize a 5M sample of our contrastive pretraining data!
[](https://atlas.nomic.ai/map/nomic-text-embed-v1-5m-sample)
## Training Details
We train our embedder using a multi-stage training pipeline. Starting from a long-context [BERT model](https://huggingface.co/nomic-ai/nomic-bert-2048),
the first unsupervised contrastive stage trains on a dataset generated from weakly related text pairs, such as question-answer pairs from forums like StackExchange and Quora, title-body pairs from Amazon reviews, and summarizations from news articles.
In the second finetuning stage, higher quality labeled datasets such as search queries and answers from web searches are leveraged. Data curation and hard-example mining is crucial in this stage.
For more details, see the Nomic Embed [Technical Report](https://static.nomic.ai/reports/2024_Nomic_Embed_Text_Technical_Report.pdf) and corresponding [blog post](https://blog.nomic.ai/posts/nomic-embed-text-v1).
Training data to train the models is released in its entirety. For more details, see the `contrastors` [repository](https://github.com/nomic-ai/contrastors)
## Usage
Note `nomic-embed-text` *requires* prefixes! We support the prefixes `[search_query, search_document, classification, clustering]`.
For retrieval applications, you should prepend `search_document` for all your documents and `search_query` for your queries.
For example, you are building a RAG application over the top of Wikipedia. You would embed all Wikipedia articles with the prefix `search_document`
and any questions you ask with `search_query`. For example:
```python
queries = ["search_query: who is the first president of the united states?", "search_query: when was babe ruth born?"]
documents = ["search_document: <article about US Presidents>", "search_document: <article about Babe Ruth>"]
```
### Sentence Transformers
```python
from sentence_transformers import SentenceTransformer
model = SentenceTransformer("nomic-ai/nomic-embed-text-v1", trust_remote_code=True)
sentences = ['search_query: What is TSNE?', 'search_query: Who is Laurens van der Maaten?']
embeddings = model.encode(sentences)
print(embeddings)
```
### Transformers
```python
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel
def mean_pooling(model_output, attention_mask):
token_embeddings = model_output[0]
input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)
sentences = ['search_query: What is TSNE?', 'search_query: Who is Laurens van der Maaten?']
tokenizer = AutoTokenizer.from_pretrained('bert-base-uncased')
model = AutoModel.from_pretrained('nomic-ai/nomic-embed-text-v1', trust_remote_code=True)
model.eval()
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
with torch.no_grad():
model_output = model(**encoded_input)
embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
embeddings = F.normalize(embeddings, p=2, dim=1)
print(embeddings)
```
The model natively supports scaling of the sequence length past 2048 tokens. To do so,
```diff
- tokenizer = AutoTokenizer.from_pretrained('bert-base-uncased')
+ tokenizer = AutoTokenizer.from_pretrained('bert-base-uncased', model_max_length=8192)
- model = AutoModel.from_pretrained('nomic-ai/nomic-embed-text-v1', trust_remote_code=True)
+ model = AutoModel.from_pretrained('nomic-ai/nomic-embed-text-v1', trust_remote_code=True, rotary_scaling_factor=2)
```
### Transformers.js
```js
import { pipeline } from '@xenova/transformers';
// Create a feature extraction pipeline
const extractor = await pipeline('feature-extraction', 'nomic-ai/nomic-embed-text-v1', {
quantized: false, // Comment out this line to use the quantized version
});
// Compute sentence embeddings
const texts = ['search_query: What is TSNE?', 'search_query: Who is Laurens van der Maaten?'];
const embeddings = await extractor(texts, { pooling: 'mean', normalize: true });
console.log(embeddings);
```
# Join the Nomic Community
- Nomic: [https://nomic.ai](https://nomic.ai)
- Discord: [https://discord.gg/myY5YDR8z8](https://discord.gg/myY5YDR8z8)
- Twitter: [https://twitter.com/nomic_ai](https://twitter.com/nomic_ai)
# Citation
If you find the model, dataset, or training code useful, please cite our work
```bibtex
@misc{nussbaum2024nomic,
title={Nomic Embed: Training a Reproducible Long Context Text Embedder},
author={Zach Nussbaum and John X. Morris and Brandon Duderstadt and Andriy Mulyar},
year={2024},
eprint={2402.01613},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` | {"language": ["en"], "library_name": "sentence-transformers", "license": "apache-2.0", "pipeline_tag": "sentence-similarity", "tags": ["feature-extraction", "sentence-similarity", "mteb", "transformers", "transformers.js"], "model-index": [{"name": "epoch_0_model", "results": [{"task": {"type": "Classification"}, "dataset": {"name": "MTEB AmazonCounterfactualClassification (en)", "type": "mteb/amazon_counterfactual", "config": "en", "split": "test", "revision": "e8379541af4e31359cca9fbcf4b00f2671dba205"}, "metrics": [{"type": "accuracy", "value": 76.8507462686567}, {"type": "ap", "value": 40.592189159090495}, {"type": "f1", "value": 71.01634655512476}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB AmazonPolarityClassification", "type": "mteb/amazon_polarity", "config": "default", "split": "test", "revision": "e2d317d38cd51312af73b3d32a06d1a08b442046"}, "metrics": [{"type": "accuracy", "value": 91.51892500000001}, {"type": "ap", "value": 88.50346762975335}, {"type": "f1", "value": 91.50342077459624}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB AmazonReviewsClassification (en)", "type": "mteb/amazon_reviews_multi", "config": "en", "split": "test", "revision": "1399c76144fd37290681b995c656ef9b2e06e26d"}, "metrics": [{"type": "accuracy", "value": 47.364}, {"type": "f1", "value": 46.72708080922794}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB ArguAna", "type": "arguana", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "map_at_1", "value": 25.178}, {"type": "map_at_10", "value": 40.244}, {"type": "map_at_100", "value": 41.321999999999996}, {"type": "map_at_1000", "value": 41.331}, {"type": "map_at_3", "value": 35.016999999999996}, {"type": "map_at_5", "value": 37.99}, {"type": "mrr_at_1", "value": 25.605}, {"type": "mrr_at_10", "value": 40.422000000000004}, {"type": "mrr_at_100", "value": 41.507}, {"type": "mrr_at_1000", "value": 41.516}, {"type": "mrr_at_3", "value": 35.23}, {"type": "mrr_at_5", "value": 38.15}, {"type": "ndcg_at_1", "value": 25.178}, {"type": "ndcg_at_10", "value": 49.258}, {"type": "ndcg_at_100", "value": 53.776}, {"type": "ndcg_at_1000", "value": 53.995000000000005}, {"type": "ndcg_at_3", "value": 38.429}, {"type": "ndcg_at_5", "value": 43.803}, {"type": "precision_at_1", "value": 25.178}, {"type": "precision_at_10", "value": 7.831}, {"type": "precision_at_100", "value": 0.979}, {"type": "precision_at_1000", "value": 0.1}, {"type": "precision_at_3", "value": 16.121}, {"type": "precision_at_5", "value": 12.29}, {"type": "recall_at_1", "value": 25.178}, {"type": "recall_at_10", "value": 78.307}, {"type": "recall_at_100", "value": 97.866}, {"type": "recall_at_1000", "value": 99.57300000000001}, {"type": "recall_at_3", "value": 48.364000000000004}, {"type": "recall_at_5", "value": 61.451}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB ArxivClusteringP2P", "type": "mteb/arxiv-clustering-p2p", "config": "default", "split": "test", "revision": "a122ad7f3f0291bf49cc6f4d32aa80929df69d5d"}, "metrics": [{"type": "v_measure", "value": 45.93034494751465}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB ArxivClusteringS2S", "type": "mteb/arxiv-clustering-s2s", "config": "default", "split": "test", "revision": "f910caf1a6075f7329cdf8c1a6135696f37dbd53"}, "metrics": [{"type": "v_measure", "value": 36.64579480054327}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB AskUbuntuDupQuestions", "type": "mteb/askubuntudupquestions-reranking", "config": "default", 
"split": "test", "revision": "2000358ca161889fa9c082cb41daa8dcfb161a54"}, "metrics": [{"type": "map", "value": 60.601310529222054}, {"type": "mrr", "value": 75.04484896451656}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB BIOSSES", "type": "mteb/biosses-sts", "config": "default", "split": "test", "revision": "d3fb88f8f02e40887cd149695127462bbcf29b4a"}, "metrics": [{"type": "cos_sim_pearson", "value": 88.57797718095814}, {"type": "cos_sim_spearman", "value": 86.47064499110101}, {"type": "euclidean_pearson", "value": 87.4559602783142}, {"type": "euclidean_spearman", "value": 86.47064499110101}, {"type": "manhattan_pearson", "value": 87.7232764230245}, {"type": "manhattan_spearman", "value": 86.91222131777742}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB Banking77Classification", "type": "mteb/banking77", "config": "default", "split": "test", "revision": "0fd18e25b25c072e09e0d92ab615fda904d66300"}, "metrics": [{"type": "accuracy", "value": 84.5422077922078}, {"type": "f1", "value": 84.47657456950589}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB BiorxivClusteringP2P", "type": "mteb/biorxiv-clustering-p2p", "config": "default", "split": "test", "revision": "65b79d1d13f80053f67aca9498d9402c2d9f1f40"}, "metrics": [{"type": "v_measure", "value": 38.48953561974464}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB BiorxivClusteringS2S", "type": "mteb/biorxiv-clustering-s2s", "config": "default", "split": "test", "revision": "258694dd0231531bc1fd9de6ceb52a0853c6d908"}, "metrics": [{"type": "v_measure", "value": 32.75995857510105}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackAndroidRetrieval", "type": "BeIR/cqadupstack", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "map_at_1", "value": 30.008000000000003}, {"type": "map_at_10", "value": 39.51}, {"type": "map_at_100", "value": 40.841}, {"type": "map_at_1000", "value": 40.973}, {"type": "map_at_3", "value": 36.248999999999995}, {"type": "map_at_5", "value": 38.096999999999994}, {"type": "mrr_at_1", "value": 36.481}, {"type": "mrr_at_10", "value": 44.818000000000005}, {"type": "mrr_at_100", "value": 45.64}, {"type": "mrr_at_1000", "value": 45.687}, {"type": "mrr_at_3", "value": 42.036}, {"type": "mrr_at_5", "value": 43.782}, {"type": "ndcg_at_1", "value": 36.481}, {"type": "ndcg_at_10", "value": 45.152}, {"type": "ndcg_at_100", "value": 50.449}, {"type": "ndcg_at_1000", "value": 52.76499999999999}, {"type": "ndcg_at_3", "value": 40.161}, {"type": "ndcg_at_5", "value": 42.577999999999996}, {"type": "precision_at_1", "value": 36.481}, {"type": "precision_at_10", "value": 8.369}, {"type": "precision_at_100", "value": 1.373}, {"type": "precision_at_1000", "value": 0.186}, {"type": "precision_at_3", "value": 18.693}, {"type": "precision_at_5", "value": 13.533999999999999}, {"type": "recall_at_1", "value": 30.008000000000003}, {"type": "recall_at_10", "value": 56.108999999999995}, {"type": "recall_at_100", "value": 78.55499999999999}, {"type": "recall_at_1000", "value": 93.659}, {"type": "recall_at_3", "value": 41.754999999999995}, {"type": "recall_at_5", "value": 48.296}, {"type": "map_at_1", "value": 30.262}, {"type": "map_at_10", "value": 40.139}, {"type": "map_at_100", "value": 41.394}, {"type": "map_at_1000", "value": 41.526}, {"type": "map_at_3", "value": 37.155}, {"type": "map_at_5", "value": 38.785}, {"type": "mrr_at_1", "value": 38.153}, {"type": "mrr_at_10", "value": 46.369}, {"type": "mrr_at_100", "value": 47.072}, {"type": 
"mrr_at_1000", "value": 47.111999999999995}, {"type": "mrr_at_3", "value": 44.268}, {"type": "mrr_at_5", "value": 45.389}, {"type": "ndcg_at_1", "value": 38.153}, {"type": "ndcg_at_10", "value": 45.925}, {"type": "ndcg_at_100", "value": 50.394000000000005}, {"type": "ndcg_at_1000", "value": 52.37500000000001}, {"type": "ndcg_at_3", "value": 41.754000000000005}, {"type": "ndcg_at_5", "value": 43.574}, {"type": "precision_at_1", "value": 38.153}, {"type": "precision_at_10", "value": 8.796}, {"type": "precision_at_100", "value": 1.432}, {"type": "precision_at_1000", "value": 0.189}, {"type": "precision_at_3", "value": 20.318}, {"type": "precision_at_5", "value": 14.395}, {"type": "recall_at_1", "value": 30.262}, {"type": "recall_at_10", "value": 55.72200000000001}, {"type": "recall_at_100", "value": 74.97500000000001}, {"type": "recall_at_1000", "value": 87.342}, {"type": "recall_at_3", "value": 43.129}, {"type": "recall_at_5", "value": 48.336}, {"type": "map_at_1", "value": 39.951}, {"type": "map_at_10", "value": 51.248000000000005}, {"type": "map_at_100", "value": 52.188}, {"type": "map_at_1000", "value": 52.247}, {"type": "map_at_3", "value": 48.211}, {"type": "map_at_5", "value": 49.797000000000004}, {"type": "mrr_at_1", "value": 45.329}, {"type": "mrr_at_10", "value": 54.749}, {"type": "mrr_at_100", "value": 55.367999999999995}, {"type": "mrr_at_1000", "value": 55.400000000000006}, {"type": "mrr_at_3", "value": 52.382}, {"type": "mrr_at_5", "value": 53.649}, {"type": "ndcg_at_1", "value": 45.329}, {"type": "ndcg_at_10", "value": 56.847}, {"type": "ndcg_at_100", "value": 60.738}, {"type": "ndcg_at_1000", "value": 61.976}, {"type": "ndcg_at_3", "value": 51.59}, {"type": "ndcg_at_5", "value": 53.915}, {"type": "precision_at_1", "value": 45.329}, {"type": "precision_at_10", "value": 8.959}, {"type": "precision_at_100", "value": 1.187}, {"type": "precision_at_1000", "value": 0.134}, {"type": "precision_at_3", "value": 22.612}, {"type": "precision_at_5", "value": 15.273}, {"type": "recall_at_1", "value": 39.951}, {"type": "recall_at_10", "value": 70.053}, {"type": "recall_at_100", "value": 86.996}, {"type": "recall_at_1000", "value": 95.707}, {"type": "recall_at_3", "value": 56.032000000000004}, {"type": "recall_at_5", "value": 61.629999999999995}, {"type": "map_at_1", "value": 25.566}, {"type": "map_at_10", "value": 33.207}, {"type": "map_at_100", "value": 34.166000000000004}, {"type": "map_at_1000", "value": 34.245}, {"type": "map_at_3", "value": 30.94}, {"type": "map_at_5", "value": 32.01}, {"type": "mrr_at_1", "value": 27.345000000000002}, {"type": "mrr_at_10", "value": 35.193000000000005}, {"type": "mrr_at_100", "value": 35.965}, {"type": "mrr_at_1000", "value": 36.028999999999996}, {"type": "mrr_at_3", "value": 32.806000000000004}, {"type": "mrr_at_5", "value": 34.021}, {"type": "ndcg_at_1", "value": 27.345000000000002}, {"type": "ndcg_at_10", "value": 37.891999999999996}, {"type": "ndcg_at_100", "value": 42.664}, {"type": "ndcg_at_1000", "value": 44.757000000000005}, {"type": "ndcg_at_3", "value": 33.123000000000005}, {"type": "ndcg_at_5", "value": 35.035}, {"type": "precision_at_1", "value": 27.345000000000002}, {"type": "precision_at_10", "value": 5.763}, {"type": "precision_at_100", "value": 0.859}, {"type": "precision_at_1000", "value": 0.108}, {"type": "precision_at_3", "value": 13.71}, {"type": "precision_at_5", "value": 9.401}, {"type": "recall_at_1", "value": 25.566}, {"type": "recall_at_10", "value": 50.563}, {"type": "recall_at_100", "value": 72.86399999999999}, {"type": 
"recall_at_1000", "value": 88.68599999999999}, {"type": "recall_at_3", "value": 37.43}, {"type": "recall_at_5", "value": 41.894999999999996}, {"type": "map_at_1", "value": 16.663}, {"type": "map_at_10", "value": 23.552}, {"type": "map_at_100", "value": 24.538}, {"type": "map_at_1000", "value": 24.661}, {"type": "map_at_3", "value": 21.085}, {"type": "map_at_5", "value": 22.391}, {"type": "mrr_at_1", "value": 20.025000000000002}, {"type": "mrr_at_10", "value": 27.643}, {"type": "mrr_at_100", "value": 28.499999999999996}, {"type": "mrr_at_1000", "value": 28.582}, {"type": "mrr_at_3", "value": 25.083}, {"type": "mrr_at_5", "value": 26.544}, {"type": "ndcg_at_1", "value": 20.025000000000002}, {"type": "ndcg_at_10", "value": 28.272000000000002}, {"type": "ndcg_at_100", "value": 33.353}, {"type": "ndcg_at_1000", "value": 36.454}, {"type": "ndcg_at_3", "value": 23.579}, {"type": "ndcg_at_5", "value": 25.685000000000002}, {"type": "precision_at_1", "value": 20.025000000000002}, {"type": "precision_at_10", "value": 5.187}, {"type": "precision_at_100", "value": 0.897}, {"type": "precision_at_1000", "value": 0.13}, {"type": "precision_at_3", "value": 10.987}, {"type": "precision_at_5", "value": 8.06}, {"type": "recall_at_1", "value": 16.663}, {"type": "recall_at_10", "value": 38.808}, {"type": "recall_at_100", "value": 61.305}, {"type": "recall_at_1000", "value": 83.571}, {"type": "recall_at_3", "value": 25.907999999999998}, {"type": "recall_at_5", "value": 31.214}, {"type": "map_at_1", "value": 27.695999999999998}, {"type": "map_at_10", "value": 37.018}, {"type": "map_at_100", "value": 38.263000000000005}, {"type": "map_at_1000", "value": 38.371}, {"type": "map_at_3", "value": 34.226}, {"type": "map_at_5", "value": 35.809999999999995}, {"type": "mrr_at_1", "value": 32.916000000000004}, {"type": "mrr_at_10", "value": 42.067}, {"type": "mrr_at_100", "value": 42.925000000000004}, {"type": "mrr_at_1000", "value": 42.978}, {"type": "mrr_at_3", "value": 39.637}, {"type": "mrr_at_5", "value": 41.134}, {"type": "ndcg_at_1", "value": 32.916000000000004}, {"type": "ndcg_at_10", "value": 42.539}, {"type": "ndcg_at_100", "value": 47.873}, {"type": "ndcg_at_1000", "value": 50.08200000000001}, {"type": "ndcg_at_3", "value": 37.852999999999994}, {"type": "ndcg_at_5", "value": 40.201}, {"type": "precision_at_1", "value": 32.916000000000004}, {"type": "precision_at_10", "value": 7.5840000000000005}, {"type": "precision_at_100", "value": 1.199}, {"type": "precision_at_1000", "value": 0.155}, {"type": "precision_at_3", "value": 17.485}, {"type": "precision_at_5", "value": 12.512}, {"type": "recall_at_1", "value": 27.695999999999998}, {"type": "recall_at_10", "value": 53.638}, {"type": "recall_at_100", "value": 76.116}, {"type": "recall_at_1000", "value": 91.069}, {"type": "recall_at_3", "value": 41.13}, {"type": "recall_at_5", "value": 46.872}, {"type": "map_at_1", "value": 24.108}, {"type": "map_at_10", "value": 33.372}, {"type": "map_at_100", "value": 34.656}, {"type": "map_at_1000", "value": 34.768}, {"type": "map_at_3", "value": 30.830999999999996}, {"type": "map_at_5", "value": 32.204}, {"type": "mrr_at_1", "value": 29.110000000000003}, {"type": "mrr_at_10", "value": 37.979}, {"type": "mrr_at_100", "value": 38.933}, {"type": "mrr_at_1000", "value": 38.988}, {"type": "mrr_at_3", "value": 35.731}, {"type": "mrr_at_5", "value": 36.963}, {"type": "ndcg_at_1", "value": 29.110000000000003}, {"type": "ndcg_at_10", "value": 38.635000000000005}, {"type": "ndcg_at_100", "value": 44.324999999999996}, {"type": 
"ndcg_at_1000", "value": 46.747}, {"type": "ndcg_at_3", "value": 34.37}, {"type": "ndcg_at_5", "value": 36.228}, {"type": "precision_at_1", "value": 29.110000000000003}, {"type": "precision_at_10", "value": 6.963}, {"type": "precision_at_100", "value": 1.146}, {"type": "precision_at_1000", "value": 0.152}, {"type": "precision_at_3", "value": 16.400000000000002}, {"type": "precision_at_5", "value": 11.552999999999999}, {"type": "recall_at_1", "value": 24.108}, {"type": "recall_at_10", "value": 49.597}, {"type": "recall_at_100", "value": 73.88900000000001}, {"type": "recall_at_1000", "value": 90.62400000000001}, {"type": "recall_at_3", "value": 37.662}, {"type": "recall_at_5", "value": 42.565}, {"type": "map_at_1", "value": 25.00791666666667}, {"type": "map_at_10", "value": 33.287749999999996}, {"type": "map_at_100", "value": 34.41141666666667}, {"type": "map_at_1000", "value": 34.52583333333333}, {"type": "map_at_3", "value": 30.734416666666668}, {"type": "map_at_5", "value": 32.137166666666666}, {"type": "mrr_at_1", "value": 29.305666666666664}, {"type": "mrr_at_10", "value": 37.22966666666666}, {"type": "mrr_at_100", "value": 38.066583333333334}, {"type": "mrr_at_1000", "value": 38.12616666666667}, {"type": "mrr_at_3", "value": 34.92275}, {"type": "mrr_at_5", "value": 36.23333333333334}, {"type": "ndcg_at_1", "value": 29.305666666666664}, {"type": "ndcg_at_10", "value": 38.25533333333333}, {"type": "ndcg_at_100", "value": 43.25266666666666}, {"type": "ndcg_at_1000", "value": 45.63583333333334}, {"type": "ndcg_at_3", "value": 33.777166666666666}, {"type": "ndcg_at_5", "value": 35.85}, {"type": "precision_at_1", "value": 29.305666666666664}, {"type": "precision_at_10", "value": 6.596416666666667}, {"type": "precision_at_100", "value": 1.0784166666666668}, {"type": "precision_at_1000", "value": 0.14666666666666664}, {"type": "precision_at_3", "value": 15.31075}, {"type": "precision_at_5", "value": 10.830916666666667}, {"type": "recall_at_1", "value": 25.00791666666667}, {"type": "recall_at_10", "value": 49.10933333333333}, {"type": "recall_at_100", "value": 71.09216666666667}, {"type": "recall_at_1000", "value": 87.77725000000001}, {"type": "recall_at_3", "value": 36.660916666666665}, {"type": "recall_at_5", "value": 41.94149999999999}, {"type": "map_at_1", "value": 23.521}, {"type": "map_at_10", "value": 30.043}, {"type": "map_at_100", "value": 30.936000000000003}, {"type": "map_at_1000", "value": 31.022}, {"type": "map_at_3", "value": 27.926000000000002}, {"type": "map_at_5", "value": 29.076999999999998}, {"type": "mrr_at_1", "value": 26.227}, {"type": "mrr_at_10", "value": 32.822}, {"type": "mrr_at_100", "value": 33.61}, {"type": "mrr_at_1000", "value": 33.672000000000004}, {"type": "mrr_at_3", "value": 30.776999999999997}, {"type": "mrr_at_5", "value": 31.866}, {"type": "ndcg_at_1", "value": 26.227}, {"type": "ndcg_at_10", "value": 34.041}, {"type": "ndcg_at_100", "value": 38.394}, {"type": "ndcg_at_1000", "value": 40.732}, {"type": "ndcg_at_3", "value": 30.037999999999997}, {"type": "ndcg_at_5", "value": 31.845000000000002}, {"type": "precision_at_1", "value": 26.227}, {"type": "precision_at_10", "value": 5.244999999999999}, {"type": "precision_at_100", "value": 0.808}, {"type": "precision_at_1000", "value": 0.107}, {"type": "precision_at_3", "value": 12.679000000000002}, {"type": "precision_at_5", "value": 8.773}, {"type": "recall_at_1", "value": 23.521}, {"type": "recall_at_10", "value": 43.633}, {"type": "recall_at_100", "value": 63.126000000000005}, {"type": "recall_at_1000", 
"value": 80.765}, {"type": "recall_at_3", "value": 32.614}, {"type": "recall_at_5", "value": 37.15}, {"type": "map_at_1", "value": 16.236}, {"type": "map_at_10", "value": 22.898}, {"type": "map_at_100", "value": 23.878}, {"type": "map_at_1000", "value": 24.009}, {"type": "map_at_3", "value": 20.87}, {"type": "map_at_5", "value": 22.025}, {"type": "mrr_at_1", "value": 19.339000000000002}, {"type": "mrr_at_10", "value": 26.382}, {"type": "mrr_at_100", "value": 27.245}, {"type": "mrr_at_1000", "value": 27.33}, {"type": "mrr_at_3", "value": 24.386}, {"type": "mrr_at_5", "value": 25.496000000000002}, {"type": "ndcg_at_1", "value": 19.339000000000002}, {"type": "ndcg_at_10", "value": 27.139999999999997}, {"type": "ndcg_at_100", "value": 31.944}, {"type": "ndcg_at_1000", "value": 35.077999999999996}, {"type": "ndcg_at_3", "value": 23.424}, {"type": "ndcg_at_5", "value": 25.188}, {"type": "precision_at_1", "value": 19.339000000000002}, {"type": "precision_at_10", "value": 4.8309999999999995}, {"type": "precision_at_100", "value": 0.845}, {"type": "precision_at_1000", "value": 0.128}, {"type": "precision_at_3", "value": 10.874}, {"type": "precision_at_5", "value": 7.825}, {"type": "recall_at_1", "value": 16.236}, {"type": "recall_at_10", "value": 36.513}, {"type": "recall_at_100", "value": 57.999}, {"type": "recall_at_1000", "value": 80.512}, {"type": "recall_at_3", "value": 26.179999999999996}, {"type": "recall_at_5", "value": 30.712}, {"type": "map_at_1", "value": 24.11}, {"type": "map_at_10", "value": 31.566}, {"type": "map_at_100", "value": 32.647}, {"type": "map_at_1000", "value": 32.753}, {"type": "map_at_3", "value": 29.24}, {"type": "map_at_5", "value": 30.564999999999998}, {"type": "mrr_at_1", "value": 28.265}, {"type": "mrr_at_10", "value": 35.504000000000005}, {"type": "mrr_at_100", "value": 36.436}, {"type": "mrr_at_1000", "value": 36.503}, {"type": "mrr_at_3", "value": 33.349000000000004}, {"type": "mrr_at_5", "value": 34.622}, {"type": "ndcg_at_1", "value": 28.265}, {"type": "ndcg_at_10", "value": 36.192}, {"type": "ndcg_at_100", "value": 41.388000000000005}, {"type": "ndcg_at_1000", "value": 43.948}, {"type": "ndcg_at_3", "value": 31.959}, {"type": "ndcg_at_5", "value": 33.998}, {"type": "precision_at_1", "value": 28.265}, {"type": "precision_at_10", "value": 5.989}, {"type": "precision_at_100", "value": 0.9650000000000001}, {"type": "precision_at_1000", "value": 0.13}, {"type": "precision_at_3", "value": 14.335}, {"type": "precision_at_5", "value": 10.112}, {"type": "recall_at_1", "value": 24.11}, {"type": "recall_at_10", "value": 46.418}, {"type": "recall_at_100", "value": 69.314}, {"type": "recall_at_1000", "value": 87.397}, {"type": "recall_at_3", "value": 34.724}, {"type": "recall_at_5", "value": 39.925}, {"type": "map_at_1", "value": 22.091}, {"type": "map_at_10", "value": 29.948999999999998}, {"type": "map_at_100", "value": 31.502000000000002}, {"type": "map_at_1000", "value": 31.713}, {"type": "map_at_3", "value": 27.464}, {"type": "map_at_5", "value": 28.968}, {"type": "mrr_at_1", "value": 26.482}, {"type": "mrr_at_10", "value": 34.009}, {"type": "mrr_at_100", "value": 35.081}, {"type": "mrr_at_1000", "value": 35.138000000000005}, {"type": "mrr_at_3", "value": 31.785000000000004}, {"type": "mrr_at_5", "value": 33.178999999999995}, {"type": "ndcg_at_1", "value": 26.482}, {"type": "ndcg_at_10", "value": 35.008}, {"type": "ndcg_at_100", "value": 41.272999999999996}, {"type": "ndcg_at_1000", "value": 43.972}, {"type": "ndcg_at_3", "value": 30.804}, {"type": "ndcg_at_5", "value": 
33.046}, {"type": "precision_at_1", "value": 26.482}, {"type": "precision_at_10", "value": 6.462}, {"type": "precision_at_100", "value": 1.431}, {"type": "precision_at_1000", "value": 0.22899999999999998}, {"type": "precision_at_3", "value": 14.360999999999999}, {"type": "precision_at_5", "value": 10.474}, {"type": "recall_at_1", "value": 22.091}, {"type": "recall_at_10", "value": 45.125}, {"type": "recall_at_100", "value": 72.313}, {"type": "recall_at_1000", "value": 89.503}, {"type": "recall_at_3", "value": 33.158}, {"type": "recall_at_5", "value": 39.086999999999996}, {"type": "map_at_1", "value": 19.883}, {"type": "map_at_10", "value": 26.951000000000004}, {"type": "map_at_100", "value": 27.927999999999997}, {"type": "map_at_1000", "value": 28.022000000000002}, {"type": "map_at_3", "value": 24.616}, {"type": "map_at_5", "value": 25.917}, {"type": "mrr_at_1", "value": 21.996}, {"type": "mrr_at_10", "value": 29.221000000000004}, {"type": "mrr_at_100", "value": 30.024}, {"type": "mrr_at_1000", "value": 30.095}, {"type": "mrr_at_3", "value": 26.833000000000002}, {"type": "mrr_at_5", "value": 28.155}, {"type": "ndcg_at_1", "value": 21.996}, {"type": "ndcg_at_10", "value": 31.421}, {"type": "ndcg_at_100", "value": 36.237}, {"type": "ndcg_at_1000", "value": 38.744}, {"type": "ndcg_at_3", "value": 26.671}, {"type": "ndcg_at_5", "value": 28.907}, {"type": "precision_at_1", "value": 21.996}, {"type": "precision_at_10", "value": 5.009}, {"type": "precision_at_100", "value": 0.799}, {"type": "precision_at_1000", "value": 0.11199999999999999}, {"type": "precision_at_3", "value": 11.275}, {"type": "precision_at_5", "value": 8.059}, {"type": "recall_at_1", "value": 19.883}, {"type": "recall_at_10", "value": 43.132999999999996}, {"type": "recall_at_100", "value": 65.654}, {"type": "recall_at_1000", "value": 84.492}, {"type": "recall_at_3", "value": 30.209000000000003}, {"type": "recall_at_5", "value": 35.616}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB ClimateFEVER", "type": "climate-fever", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "map_at_1", "value": 17.756}, {"type": "map_at_10", "value": 30.378}, {"type": "map_at_100", "value": 32.537}, {"type": "map_at_1000", "value": 32.717}, {"type": "map_at_3", "value": 25.599}, {"type": "map_at_5", "value": 28.372999999999998}, {"type": "mrr_at_1", "value": 41.303}, {"type": "mrr_at_10", "value": 53.483999999999995}, {"type": "mrr_at_100", "value": 54.106}, {"type": "mrr_at_1000", "value": 54.127}, {"type": "mrr_at_3", "value": 50.315}, {"type": "mrr_at_5", "value": 52.396}, {"type": "ndcg_at_1", "value": 41.303}, {"type": "ndcg_at_10", "value": 40.503}, {"type": "ndcg_at_100", "value": 47.821000000000005}, {"type": "ndcg_at_1000", "value": 50.788}, {"type": "ndcg_at_3", "value": 34.364}, {"type": "ndcg_at_5", "value": 36.818}, {"type": "precision_at_1", "value": 41.303}, {"type": "precision_at_10", "value": 12.463000000000001}, {"type": "precision_at_100", "value": 2.037}, {"type": "precision_at_1000", "value": 0.26}, {"type": "precision_at_3", "value": 25.798}, {"type": "precision_at_5", "value": 19.896}, {"type": "recall_at_1", "value": 17.756}, {"type": "recall_at_10", "value": 46.102}, {"type": "recall_at_100", "value": 70.819}, {"type": "recall_at_1000", "value": 87.21799999999999}, {"type": "recall_at_3", "value": 30.646}, {"type": "recall_at_5", "value": 38.022}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB DBPedia", "type": "dbpedia-entity", "config": "default", "split": "test", 
"revision": "None"}, "metrics": [{"type": "map_at_1", "value": 9.033}, {"type": "map_at_10", "value": 20.584}, {"type": "map_at_100", "value": 29.518}, {"type": "map_at_1000", "value": 31.186000000000003}, {"type": "map_at_3", "value": 14.468}, {"type": "map_at_5", "value": 17.177}, {"type": "mrr_at_1", "value": 69.75}, {"type": "mrr_at_10", "value": 77.025}, {"type": "mrr_at_100", "value": 77.36699999999999}, {"type": "mrr_at_1000", "value": 77.373}, {"type": "mrr_at_3", "value": 75.583}, {"type": "mrr_at_5", "value": 76.396}, {"type": "ndcg_at_1", "value": 58.5}, {"type": "ndcg_at_10", "value": 45.033}, {"type": "ndcg_at_100", "value": 49.071}, {"type": "ndcg_at_1000", "value": 56.056}, {"type": "ndcg_at_3", "value": 49.936}, {"type": "ndcg_at_5", "value": 47.471999999999994}, {"type": "precision_at_1", "value": 69.75}, {"type": "precision_at_10", "value": 35.775}, {"type": "precision_at_100", "value": 11.594999999999999}, {"type": "precision_at_1000", "value": 2.062}, {"type": "precision_at_3", "value": 52.5}, {"type": "precision_at_5", "value": 45.300000000000004}, {"type": "recall_at_1", "value": 9.033}, {"type": "recall_at_10", "value": 26.596999999999998}, {"type": "recall_at_100", "value": 54.607000000000006}, {"type": "recall_at_1000", "value": 76.961}, {"type": "recall_at_3", "value": 15.754999999999999}, {"type": "recall_at_5", "value": 20.033}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB EmotionClassification", "type": "mteb/emotion", "config": "default", "split": "test", "revision": "4f58c6b202a23cf9a4da393831edf4f9183cad37"}, "metrics": [{"type": "accuracy", "value": 48.345000000000006}, {"type": "f1", "value": 43.4514918068706}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB FEVER", "type": "fever", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "map_at_1", "value": 71.29100000000001}, {"type": "map_at_10", "value": 81.059}, {"type": "map_at_100", "value": 81.341}, {"type": "map_at_1000", "value": 81.355}, {"type": "map_at_3", "value": 79.74799999999999}, {"type": "map_at_5", "value": 80.612}, {"type": "mrr_at_1", "value": 76.40299999999999}, {"type": "mrr_at_10", "value": 84.615}, {"type": "mrr_at_100", "value": 84.745}, {"type": "mrr_at_1000", "value": 84.748}, {"type": "mrr_at_3", "value": 83.776}, {"type": "mrr_at_5", "value": 84.343}, {"type": "ndcg_at_1", "value": 76.40299999999999}, {"type": "ndcg_at_10", "value": 84.981}, {"type": "ndcg_at_100", "value": 86.00999999999999}, {"type": "ndcg_at_1000", "value": 86.252}, {"type": "ndcg_at_3", "value": 82.97}, {"type": "ndcg_at_5", "value": 84.152}, {"type": "precision_at_1", "value": 76.40299999999999}, {"type": "precision_at_10", "value": 10.446}, {"type": "precision_at_100", "value": 1.1199999999999999}, {"type": "precision_at_1000", "value": 0.116}, {"type": "precision_at_3", "value": 32.147999999999996}, {"type": "precision_at_5", "value": 20.135}, {"type": "recall_at_1", "value": 71.29100000000001}, {"type": "recall_at_10", "value": 93.232}, {"type": "recall_at_100", "value": 97.363}, {"type": "recall_at_1000", "value": 98.905}, {"type": "recall_at_3", "value": 87.893}, {"type": "recall_at_5", "value": 90.804}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB FiQA2018", "type": "fiqa", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "map_at_1", "value": 18.667}, {"type": "map_at_10", "value": 30.853}, {"type": "map_at_100", "value": 32.494}, {"type": "map_at_1000", "value": 32.677}, {"type": "map_at_3", 
"value": 26.91}, {"type": "map_at_5", "value": 29.099000000000004}, {"type": "mrr_at_1", "value": 37.191}, {"type": "mrr_at_10", "value": 46.171}, {"type": "mrr_at_100", "value": 47.056}, {"type": "mrr_at_1000", "value": 47.099000000000004}, {"type": "mrr_at_3", "value": 44.059}, {"type": "mrr_at_5", "value": 45.147}, {"type": "ndcg_at_1", "value": 37.191}, {"type": "ndcg_at_10", "value": 38.437}, {"type": "ndcg_at_100", "value": 44.62}, {"type": "ndcg_at_1000", "value": 47.795}, {"type": "ndcg_at_3", "value": 35.003}, {"type": "ndcg_at_5", "value": 36.006}, {"type": "precision_at_1", "value": 37.191}, {"type": "precision_at_10", "value": 10.586}, {"type": "precision_at_100", "value": 1.688}, {"type": "precision_at_1000", "value": 0.22699999999999998}, {"type": "precision_at_3", "value": 23.302}, {"type": "precision_at_5", "value": 17.006}, {"type": "recall_at_1", "value": 18.667}, {"type": "recall_at_10", "value": 45.367000000000004}, {"type": "recall_at_100", "value": 68.207}, {"type": "recall_at_1000", "value": 87.072}, {"type": "recall_at_3", "value": 32.129000000000005}, {"type": "recall_at_5", "value": 37.719}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB HotpotQA", "type": "hotpotqa", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "map_at_1", "value": 39.494}, {"type": "map_at_10", "value": 66.223}, {"type": "map_at_100", "value": 67.062}, {"type": "map_at_1000", "value": 67.11500000000001}, {"type": "map_at_3", "value": 62.867}, {"type": "map_at_5", "value": 64.994}, {"type": "mrr_at_1", "value": 78.987}, {"type": "mrr_at_10", "value": 84.585}, {"type": "mrr_at_100", "value": 84.773}, {"type": "mrr_at_1000", "value": 84.77900000000001}, {"type": "mrr_at_3", "value": 83.592}, {"type": "mrr_at_5", "value": 84.235}, {"type": "ndcg_at_1", "value": 78.987}, {"type": "ndcg_at_10", "value": 73.64}, {"type": "ndcg_at_100", "value": 76.519}, {"type": "ndcg_at_1000", "value": 77.51}, {"type": "ndcg_at_3", "value": 68.893}, {"type": "ndcg_at_5", "value": 71.585}, {"type": "precision_at_1", "value": 78.987}, {"type": "precision_at_10", "value": 15.529000000000002}, {"type": "precision_at_100", "value": 1.7770000000000001}, {"type": "precision_at_1000", "value": 0.191}, {"type": "precision_at_3", "value": 44.808}, {"type": "precision_at_5", "value": 29.006999999999998}, {"type": "recall_at_1", "value": 39.494}, {"type": "recall_at_10", "value": 77.643}, {"type": "recall_at_100", "value": 88.825}, {"type": "recall_at_1000", "value": 95.321}, {"type": "recall_at_3", "value": 67.211}, {"type": "recall_at_5", "value": 72.519}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB ImdbClassification", "type": "mteb/imdb", "config": "default", "split": "test", "revision": "3d86128a09e091d6018b6d26cad27f2739fc2db7"}, "metrics": [{"type": "accuracy", "value": 85.55959999999999}, {"type": "ap", "value": 80.7246500384617}, {"type": "f1", "value": 85.52336485065454}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB MSMARCO", "type": "msmarco", "config": "default", "split": "dev", "revision": "None"}, "metrics": [{"type": "map_at_1", "value": 23.631}, {"type": "map_at_10", "value": 36.264}, {"type": "map_at_100", "value": 37.428}, {"type": "map_at_1000", "value": 37.472}, {"type": "map_at_3", "value": 32.537}, {"type": "map_at_5", "value": 34.746}, {"type": "mrr_at_1", "value": 24.312}, {"type": "mrr_at_10", "value": 36.858000000000004}, {"type": "mrr_at_100", "value": 37.966}, {"type": "mrr_at_1000", "value": 38.004}, {"type": 
"mrr_at_3", "value": 33.188}, {"type": "mrr_at_5", "value": 35.367}, {"type": "ndcg_at_1", "value": 24.312}, {"type": "ndcg_at_10", "value": 43.126999999999995}, {"type": "ndcg_at_100", "value": 48.642}, {"type": "ndcg_at_1000", "value": 49.741}, {"type": "ndcg_at_3", "value": 35.589}, {"type": "ndcg_at_5", "value": 39.515}, {"type": "precision_at_1", "value": 24.312}, {"type": "precision_at_10", "value": 6.699}, {"type": "precision_at_100", "value": 0.9450000000000001}, {"type": "precision_at_1000", "value": 0.104}, {"type": "precision_at_3", "value": 15.153}, {"type": "precision_at_5", "value": 11.065999999999999}, {"type": "recall_at_1", "value": 23.631}, {"type": "recall_at_10", "value": 64.145}, {"type": "recall_at_100", "value": 89.41}, {"type": "recall_at_1000", "value": 97.83500000000001}, {"type": "recall_at_3", "value": 43.769000000000005}, {"type": "recall_at_5", "value": 53.169}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MTOPDomainClassification (en)", "type": "mteb/mtop_domain", "config": "en", "split": "test", "revision": "d80d48c1eb48d3562165c59d59d0034df9fff0bf"}, "metrics": [{"type": "accuracy", "value": 93.4108527131783}, {"type": "f1", "value": 93.1415880261038}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MTOPIntentClassification (en)", "type": "mteb/mtop_intent", "config": "en", "split": "test", "revision": "ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba"}, "metrics": [{"type": "accuracy", "value": 77.24806201550388}, {"type": "f1", "value": 60.531916308197175}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MassiveIntentClassification (en)", "type": "mteb/amazon_massive_intent", "config": "en", "split": "test", "revision": "31efe3c427b0bae9c22cbb560b8f15491cc6bed7"}, "metrics": [{"type": "accuracy", "value": 73.71553463349024}, {"type": "f1", "value": 71.70753174900791}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MassiveScenarioClassification (en)", "type": "mteb/amazon_massive_scenario", "config": "en", "split": "test", "revision": "7d571f92784cd94a019292a1f45445077d0ef634"}, "metrics": [{"type": "accuracy", "value": 77.79757901815736}, {"type": "f1", "value": 77.83719850433258}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB MedrxivClusteringP2P", "type": "mteb/medrxiv-clustering-p2p", "config": "default", "split": "test", "revision": "e7a26af6f3ae46b30dde8737f02c07b1505bcc73"}, "metrics": [{"type": "v_measure", "value": 33.74193296622113}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB MedrxivClusteringS2S", "type": "mteb/medrxiv-clustering-s2s", "config": "default", "split": "test", "revision": "35191c8c0dca72d8ff3efcd72aa802307d469663"}, "metrics": [{"type": "v_measure", "value": 30.64257594108566}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB MindSmallReranking", "type": "mteb/mind_small", "config": "default", "split": "test", "revision": "3bdac13927fdc888b903db93b2ffdbd90b295a69"}, "metrics": [{"type": "map", "value": 30.811018518883625}, {"type": "mrr", "value": 31.910376577445003}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB NFCorpus", "type": "nfcorpus", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "map_at_1", "value": 5.409}, {"type": "map_at_10", "value": 13.093}, {"type": "map_at_100", "value": 16.256999999999998}, {"type": "map_at_1000", "value": 17.617}, {"type": "map_at_3", "value": 9.555}, {"type": "map_at_5", "value": 11.428}, {"type": "mrr_at_1", "value": 45.201}, {"type": 
"mrr_at_10", "value": 54.179}, {"type": "mrr_at_100", "value": 54.812000000000005}, {"type": "mrr_at_1000", "value": 54.840999999999994}, {"type": "mrr_at_3", "value": 51.909000000000006}, {"type": "mrr_at_5", "value": 53.519000000000005}, {"type": "ndcg_at_1", "value": 43.189}, {"type": "ndcg_at_10", "value": 35.028}, {"type": "ndcg_at_100", "value": 31.226}, {"type": "ndcg_at_1000", "value": 39.678000000000004}, {"type": "ndcg_at_3", "value": 40.596}, {"type": "ndcg_at_5", "value": 38.75}, {"type": "precision_at_1", "value": 44.582}, {"type": "precision_at_10", "value": 25.974999999999998}, {"type": "precision_at_100", "value": 7.793}, {"type": "precision_at_1000", "value": 2.036}, {"type": "precision_at_3", "value": 38.493}, {"type": "precision_at_5", "value": 33.994}, {"type": "recall_at_1", "value": 5.409}, {"type": "recall_at_10", "value": 16.875999999999998}, {"type": "recall_at_100", "value": 30.316}, {"type": "recall_at_1000", "value": 60.891}, {"type": "recall_at_3", "value": 10.688}, {"type": "recall_at_5", "value": 13.832}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB NQ", "type": "nq", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "map_at_1", "value": 36.375}, {"type": "map_at_10", "value": 51.991}, {"type": "map_at_100", "value": 52.91400000000001}, {"type": "map_at_1000", "value": 52.93600000000001}, {"type": "map_at_3", "value": 48.014}, {"type": "map_at_5", "value": 50.381}, {"type": "mrr_at_1", "value": 40.759}, {"type": "mrr_at_10", "value": 54.617000000000004}, {"type": "mrr_at_100", "value": 55.301}, {"type": "mrr_at_1000", "value": 55.315000000000005}, {"type": "mrr_at_3", "value": 51.516}, {"type": "mrr_at_5", "value": 53.435}, {"type": "ndcg_at_1", "value": 40.759}, {"type": "ndcg_at_10", "value": 59.384}, {"type": "ndcg_at_100", "value": 63.157}, {"type": "ndcg_at_1000", "value": 63.654999999999994}, {"type": "ndcg_at_3", "value": 52.114000000000004}, {"type": "ndcg_at_5", "value": 55.986000000000004}, {"type": "precision_at_1", "value": 40.759}, {"type": "precision_at_10", "value": 9.411999999999999}, {"type": "precision_at_100", "value": 1.153}, {"type": "precision_at_1000", "value": 0.12}, {"type": "precision_at_3", "value": 23.329}, {"type": "precision_at_5", "value": 16.256999999999998}, {"type": "recall_at_1", "value": 36.375}, {"type": "recall_at_10", "value": 79.053}, {"type": "recall_at_100", "value": 95.167}, {"type": "recall_at_1000", "value": 98.82}, {"type": "recall_at_3", "value": 60.475}, {"type": "recall_at_5", "value": 69.327}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB QuoraRetrieval", "type": "quora", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "map_at_1", "value": 70.256}, {"type": "map_at_10", "value": 83.8}, {"type": "map_at_100", "value": 84.425}, {"type": "map_at_1000", "value": 84.444}, {"type": "map_at_3", "value": 80.906}, {"type": "map_at_5", "value": 82.717}, {"type": "mrr_at_1", "value": 80.97999999999999}, {"type": "mrr_at_10", "value": 87.161}, {"type": "mrr_at_100", "value": 87.262}, {"type": "mrr_at_1000", "value": 87.263}, {"type": "mrr_at_3", "value": 86.175}, {"type": "mrr_at_5", "value": 86.848}, {"type": "ndcg_at_1", "value": 80.97999999999999}, {"type": "ndcg_at_10", "value": 87.697}, {"type": "ndcg_at_100", "value": 88.959}, {"type": "ndcg_at_1000", "value": 89.09899999999999}, {"type": "ndcg_at_3", "value": 84.83800000000001}, {"type": "ndcg_at_5", "value": 86.401}, {"type": "precision_at_1", "value": 80.97999999999999}, 
{"type": "precision_at_10", "value": 13.261000000000001}, {"type": "precision_at_100", "value": 1.5150000000000001}, {"type": "precision_at_1000", "value": 0.156}, {"type": "precision_at_3", "value": 37.01}, {"type": "precision_at_5", "value": 24.298000000000002}, {"type": "recall_at_1", "value": 70.256}, {"type": "recall_at_10", "value": 94.935}, {"type": "recall_at_100", "value": 99.274}, {"type": "recall_at_1000", "value": 99.928}, {"type": "recall_at_3", "value": 86.602}, {"type": "recall_at_5", "value": 91.133}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB RedditClustering", "type": "mteb/reddit-clustering", "config": "default", "split": "test", "revision": "24640382cdbf8abc73003fb0fa6d111a705499eb"}, "metrics": [{"type": "v_measure", "value": 56.322692497613104}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB RedditClusteringP2P", "type": "mteb/reddit-clustering-p2p", "config": "default", "split": "test", "revision": "282350215ef01743dc01b456c7f5241fa8937f16"}, "metrics": [{"type": "v_measure", "value": 61.895813503775074}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB SCIDOCS", "type": "scidocs", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "map_at_1", "value": 4.338}, {"type": "map_at_10", "value": 10.767}, {"type": "map_at_100", "value": 12.537999999999998}, {"type": "map_at_1000", "value": 12.803999999999998}, {"type": "map_at_3", "value": 7.788}, {"type": "map_at_5", "value": 9.302000000000001}, {"type": "mrr_at_1", "value": 21.4}, {"type": "mrr_at_10", "value": 31.637999999999998}, {"type": "mrr_at_100", "value": 32.688}, {"type": "mrr_at_1000", "value": 32.756}, {"type": "mrr_at_3", "value": 28.433000000000003}, {"type": "mrr_at_5", "value": 30.178}, {"type": "ndcg_at_1", "value": 21.4}, {"type": "ndcg_at_10", "value": 18.293}, {"type": "ndcg_at_100", "value": 25.274}, {"type": "ndcg_at_1000", "value": 30.284}, {"type": "ndcg_at_3", "value": 17.391000000000002}, {"type": "ndcg_at_5", "value": 15.146999999999998}, {"type": "precision_at_1", "value": 21.4}, {"type": "precision_at_10", "value": 9.48}, {"type": "precision_at_100", "value": 1.949}, {"type": "precision_at_1000", "value": 0.316}, {"type": "precision_at_3", "value": 16.167}, {"type": "precision_at_5", "value": 13.22}, {"type": "recall_at_1", "value": 4.338}, {"type": "recall_at_10", "value": 19.213}, {"type": "recall_at_100", "value": 39.562999999999995}, {"type": "recall_at_1000", "value": 64.08}, {"type": "recall_at_3", "value": 9.828000000000001}, {"type": "recall_at_5", "value": 13.383000000000001}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB SICK-R", "type": "mteb/sickr-sts", "config": "default", "split": "test", "revision": "a6ea5a8cab320b040a23452cc28066d9beae2cee"}, "metrics": [{"type": "cos_sim_pearson", "value": 82.42568163642142}, {"type": "cos_sim_spearman", "value": 78.5797159641342}, {"type": "euclidean_pearson", "value": 80.22151260811604}, {"type": "euclidean_spearman", "value": 78.5797151953878}, {"type": "manhattan_pearson", "value": 80.21224215864788}, {"type": "manhattan_spearman", "value": 78.55641478381344}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS12", "type": "mteb/sts12-sts", "config": "default", "split": "test", "revision": "a0d554a64d88156834ff5ae9920b964011b16384"}, "metrics": [{"type": "cos_sim_pearson", "value": 85.44020710812569}, {"type": "cos_sim_spearman", "value": 78.91631735081286}, {"type": "euclidean_pearson", "value": 81.64188964182102}, {"type": "euclidean_spearman", "value": 
78.91633286881678}, {"type": "manhattan_pearson", "value": 81.69294748512496}, {"type": "manhattan_spearman", "value": 78.93438558002656}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS13", "type": "mteb/sts13-sts", "config": "default", "split": "test", "revision": "7e90230a92c190f1bf69ae9002b8cea547a64cca"}, "metrics": [{"type": "cos_sim_pearson", "value": 84.27165426412311}, {"type": "cos_sim_spearman", "value": 85.40429140249618}, {"type": "euclidean_pearson", "value": 84.7509580724893}, {"type": "euclidean_spearman", "value": 85.40429140249618}, {"type": "manhattan_pearson", "value": 84.76488289321308}, {"type": "manhattan_spearman", "value": 85.4256793698708}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS14", "type": "mteb/sts14-sts", "config": "default", "split": "test", "revision": "6031580fec1f6af667f0bd2da0a551cf4f0b2375"}, "metrics": [{"type": "cos_sim_pearson", "value": 83.138851760732}, {"type": "cos_sim_spearman", "value": 81.64101363896586}, {"type": "euclidean_pearson", "value": 82.55165038934942}, {"type": "euclidean_spearman", "value": 81.64105257080502}, {"type": "manhattan_pearson", "value": 82.52802949883335}, {"type": "manhattan_spearman", "value": 81.61255430718158}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS15", "type": "mteb/sts15-sts", "config": "default", "split": "test", "revision": "ae752c7c21bf194d8b67fd573edf7ae58183cbe3"}, "metrics": [{"type": "cos_sim_pearson", "value": 86.0654695484029}, {"type": "cos_sim_spearman", "value": 87.20408521902229}, {"type": "euclidean_pearson", "value": 86.8110651362115}, {"type": "euclidean_spearman", "value": 87.20408521902229}, {"type": "manhattan_pearson", "value": 86.77984656478691}, {"type": "manhattan_spearman", "value": 87.1719947099227}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS16", "type": "mteb/sts16-sts", "config": "default", "split": "test", "revision": "4d8694f8f0e0100860b497b999b3dbed754a0513"}, "metrics": [{"type": "cos_sim_pearson", "value": 83.77823915496512}, {"type": "cos_sim_spearman", "value": 85.43566325729779}, {"type": "euclidean_pearson", "value": 84.5396956658821}, {"type": "euclidean_spearman", "value": 85.43566325729779}, {"type": "manhattan_pearson", "value": 84.5665398848169}, {"type": "manhattan_spearman", "value": 85.44375870303232}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS17 (en-en)", "type": "mteb/sts17-crosslingual-sts", "config": "en-en", "split": "test", "revision": "af5e6fb845001ecf41f4c1e033ce921939a2a68d"}, "metrics": [{"type": "cos_sim_pearson", "value": 87.20030208471798}, {"type": "cos_sim_spearman", "value": 87.20485505076539}, {"type": "euclidean_pearson", "value": 88.10588324368722}, {"type": "euclidean_spearman", "value": 87.20485505076539}, {"type": "manhattan_pearson", "value": 87.92324770415183}, {"type": "manhattan_spearman", "value": 87.0571314561877}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS22 (en)", "type": "mteb/sts22-crosslingual-sts", "config": "en", "split": "test", "revision": "6d1ba47164174a496b7fa5d3569dae26a6813b80"}, "metrics": [{"type": "cos_sim_pearson", "value": 63.06093161604453}, {"type": "cos_sim_spearman", "value": 64.2163140357722}, {"type": "euclidean_pearson", "value": 65.27589680994006}, {"type": "euclidean_spearman", "value": 64.2163140357722}, {"type": "manhattan_pearson", "value": 65.45904383711101}, {"type": "manhattan_spearman", "value": 64.55404716679305}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STSBenchmark", "type": "mteb/stsbenchmark-sts", 
"config": "default", "split": "test", "revision": "b0fddb56ed78048fa8b90373c8a3cfc37b684831"}, "metrics": [{"type": "cos_sim_pearson", "value": 84.32976164578706}, {"type": "cos_sim_spearman", "value": 85.54302197678368}, {"type": "euclidean_pearson", "value": 85.26307149193056}, {"type": "euclidean_spearman", "value": 85.54302197678368}, {"type": "manhattan_pearson", "value": 85.26647282029371}, {"type": "manhattan_spearman", "value": 85.5316135265568}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB SciDocsRR", "type": "mteb/scidocs-reranking", "config": "default", "split": "test", "revision": "d3c5e1fc0b855ab6097bf1cda04dd73947d7caab"}, "metrics": [{"type": "map", "value": 81.44675968318754}, {"type": "mrr", "value": 94.92741826075158}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB SciFact", "type": "scifact", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "map_at_1", "value": 56.34400000000001}, {"type": "map_at_10", "value": 65.927}, {"type": "map_at_100", "value": 66.431}, {"type": "map_at_1000", "value": 66.461}, {"type": "map_at_3", "value": 63.529}, {"type": "map_at_5", "value": 64.818}, {"type": "mrr_at_1", "value": 59.333000000000006}, {"type": "mrr_at_10", "value": 67.54599999999999}, {"type": "mrr_at_100", "value": 67.892}, {"type": "mrr_at_1000", "value": 67.917}, {"type": "mrr_at_3", "value": 65.778}, {"type": "mrr_at_5", "value": 66.794}, {"type": "ndcg_at_1", "value": 59.333000000000006}, {"type": "ndcg_at_10", "value": 70.5}, {"type": "ndcg_at_100", "value": 72.688}, {"type": "ndcg_at_1000", "value": 73.483}, {"type": "ndcg_at_3", "value": 66.338}, {"type": "ndcg_at_5", "value": 68.265}, {"type": "precision_at_1", "value": 59.333000000000006}, {"type": "precision_at_10", "value": 9.3}, {"type": "precision_at_100", "value": 1.053}, {"type": "precision_at_1000", "value": 0.11199999999999999}, {"type": "precision_at_3", "value": 25.889}, {"type": "precision_at_5", "value": 16.866999999999997}, {"type": "recall_at_1", "value": 56.34400000000001}, {"type": "recall_at_10", "value": 82.789}, {"type": "recall_at_100", "value": 92.767}, {"type": "recall_at_1000", "value": 99}, {"type": "recall_at_3", "value": 71.64399999999999}, {"type": "recall_at_5", "value": 76.322}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB SprintDuplicateQuestions", "type": "mteb/sprintduplicatequestions-pairclassification", "config": "default", "split": "test", "revision": "d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46"}, "metrics": [{"type": "cos_sim_accuracy", "value": 99.75742574257426}, {"type": "cos_sim_ap", "value": 93.52081548447406}, {"type": "cos_sim_f1", "value": 87.33850129198966}, {"type": "cos_sim_precision", "value": 90.37433155080214}, {"type": "cos_sim_recall", "value": 84.5}, {"type": "dot_accuracy", "value": 99.75742574257426}, {"type": "dot_ap", "value": 93.52081548447406}, {"type": "dot_f1", "value": 87.33850129198966}, {"type": "dot_precision", "value": 90.37433155080214}, {"type": "dot_recall", "value": 84.5}, {"type": "euclidean_accuracy", "value": 99.75742574257426}, {"type": "euclidean_ap", "value": 93.52081548447406}, {"type": "euclidean_f1", "value": 87.33850129198966}, {"type": "euclidean_precision", "value": 90.37433155080214}, {"type": "euclidean_recall", "value": 84.5}, {"type": "manhattan_accuracy", "value": 99.75841584158415}, {"type": "manhattan_ap", "value": 93.4975678585854}, {"type": "manhattan_f1", "value": 87.26708074534162}, {"type": "manhattan_precision", "value": 90.45064377682404}, 
{"type": "manhattan_recall", "value": 84.3}, {"type": "max_accuracy", "value": 99.75841584158415}, {"type": "max_ap", "value": 93.52081548447406}, {"type": "max_f1", "value": 87.33850129198966}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB StackExchangeClustering", "type": "mteb/stackexchange-clustering", "config": "default", "split": "test", "revision": "6cbc1f7b2bc0622f2e39d2c77fa502909748c259"}, "metrics": [{"type": "v_measure", "value": 64.31437036686651}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB StackExchangeClusteringP2P", "type": "mteb/stackexchange-clustering-p2p", "config": "default", "split": "test", "revision": "815ca46b2622cec33ccafc3735d572c266efdb44"}, "metrics": [{"type": "v_measure", "value": 33.25569319007206}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB StackOverflowDupQuestions", "type": "mteb/stackoverflowdupquestions-reranking", "config": "default", "split": "test", "revision": "e185fbe320c72810689fc5848eb6114e1ef5ec69"}, "metrics": [{"type": "map", "value": 49.90474939720706}, {"type": "mrr", "value": 50.568115503777264}]}, {"task": {"type": "Summarization"}, "dataset": {"name": "MTEB SummEval", "type": "mteb/summeval", "config": "default", "split": "test", "revision": "cda12ad7615edc362dbf25a00fdd61d3b1eaf93c"}, "metrics": [{"type": "cos_sim_pearson", "value": 29.866828641244712}, {"type": "cos_sim_spearman", "value": 30.077555055873866}, {"type": "dot_pearson", "value": 29.866832988572266}, {"type": "dot_spearman", "value": 30.077555055873866}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB TRECCOVID", "type": "trec-covid", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "map_at_1", "value": 0.232}, {"type": "map_at_10", "value": 2.094}, {"type": "map_at_100", "value": 11.971}, {"type": "map_at_1000", "value": 28.158}, {"type": "map_at_3", "value": 0.688}, {"type": "map_at_5", "value": 1.114}, {"type": "mrr_at_1", "value": 88}, {"type": "mrr_at_10", "value": 93.4}, {"type": "mrr_at_100", "value": 93.4}, {"type": "mrr_at_1000", "value": 93.4}, {"type": "mrr_at_3", "value": 93}, {"type": "mrr_at_5", "value": 93.4}, {"type": "ndcg_at_1", "value": 84}, {"type": "ndcg_at_10", "value": 79.923}, {"type": "ndcg_at_100", "value": 61.17}, {"type": "ndcg_at_1000", "value": 53.03}, {"type": "ndcg_at_3", "value": 84.592}, {"type": "ndcg_at_5", "value": 82.821}, {"type": "precision_at_1", "value": 88}, {"type": "precision_at_10", "value": 85}, {"type": "precision_at_100", "value": 63.019999999999996}, {"type": "precision_at_1000", "value": 23.554}, {"type": "precision_at_3", "value": 89.333}, {"type": "precision_at_5", "value": 87.2}, {"type": "recall_at_1", "value": 0.232}, {"type": "recall_at_10", "value": 2.255}, {"type": "recall_at_100", "value": 14.823}, {"type": "recall_at_1000", "value": 49.456}, {"type": "recall_at_3", "value": 0.718}, {"type": "recall_at_5", "value": 1.175}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB Touche2020", "type": "webis-touche2020", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "map_at_1", "value": 2.547}, {"type": "map_at_10", "value": 11.375}, {"type": "map_at_100", "value": 18.194}, {"type": "map_at_1000", "value": 19.749}, {"type": "map_at_3", "value": 5.825}, {"type": "map_at_5", "value": 8.581}, {"type": "mrr_at_1", "value": 32.653}, {"type": "mrr_at_10", "value": 51.32}, {"type": "mrr_at_100", "value": 51.747}, {"type": "mrr_at_1000", "value": 51.747}, {"type": "mrr_at_3", "value": 
47.278999999999996}, {"type": "mrr_at_5", "value": 48.605}, {"type": "ndcg_at_1", "value": 29.592000000000002}, {"type": "ndcg_at_10", "value": 28.151}, {"type": "ndcg_at_100", "value": 39.438}, {"type": "ndcg_at_1000", "value": 50.769}, {"type": "ndcg_at_3", "value": 30.758999999999997}, {"type": "ndcg_at_5", "value": 30.366}, {"type": "precision_at_1", "value": 32.653}, {"type": "precision_at_10", "value": 25.714}, {"type": "precision_at_100", "value": 8.041}, {"type": "precision_at_1000", "value": 1.555}, {"type": "precision_at_3", "value": 33.333}, {"type": "precision_at_5", "value": 31.837}, {"type": "recall_at_1", "value": 2.547}, {"type": "recall_at_10", "value": 18.19}, {"type": "recall_at_100", "value": 49.538}, {"type": "recall_at_1000", "value": 83.86}, {"type": "recall_at_3", "value": 7.329}, {"type": "recall_at_5", "value": 11.532}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB ToxicConversationsClassification", "type": "mteb/toxic_conversations_50k", "config": "default", "split": "test", "revision": "d7c0de2777da35d6aae2200a62c6e0e5af397c4c"}, "metrics": [{"type": "accuracy", "value": 71.4952}, {"type": "ap", "value": 14.793362635531409}, {"type": "f1", "value": 55.204635551516915}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB TweetSentimentExtractionClassification", "type": "mteb/tweet_sentiment_extraction", "config": "default", "split": "test", "revision": "d604517c81ca91fe16a244d1248fc021f9ecee7a"}, "metrics": [{"type": "accuracy", "value": 61.5365025466893}, {"type": "f1", "value": 61.81742556334845}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB TwentyNewsgroupsClustering", "type": "mteb/twentynewsgroups-clustering", "config": "default", "split": "test", "revision": "6125ec4e24fa026cec8a478383ee943acfbd5449"}, "metrics": [{"type": "v_measure", "value": 49.05531070301185}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB TwitterSemEval2015", "type": "mteb/twittersemeval2015-pairclassification", "config": "default", "split": "test", "revision": "70970daeab8776df92f5ea462b6173c0b46fd2d1"}, "metrics": [{"type": "cos_sim_accuracy", "value": 86.51725576682364}, {"type": "cos_sim_ap", "value": 75.2292304265163}, {"type": "cos_sim_f1", "value": 69.54022988505749}, {"type": "cos_sim_precision", "value": 63.65629110039457}, {"type": "cos_sim_recall", "value": 76.62269129287598}, {"type": "dot_accuracy", "value": 86.51725576682364}, {"type": "dot_ap", "value": 75.22922386081054}, {"type": "dot_f1", "value": 69.54022988505749}, {"type": "dot_precision", "value": 63.65629110039457}, {"type": "dot_recall", "value": 76.62269129287598}, {"type": "euclidean_accuracy", "value": 86.51725576682364}, {"type": "euclidean_ap", "value": 75.22925730473472}, {"type": "euclidean_f1", "value": 69.54022988505749}, {"type": "euclidean_precision", "value": 63.65629110039457}, {"type": "euclidean_recall", "value": 76.62269129287598}, {"type": "manhattan_accuracy", "value": 86.52321630804077}, {"type": "manhattan_ap", "value": 75.20608115037336}, {"type": "manhattan_f1", "value": 69.60000000000001}, {"type": "manhattan_precision", "value": 64.37219730941705}, {"type": "manhattan_recall", "value": 75.75197889182058}, {"type": "max_accuracy", "value": 86.52321630804077}, {"type": "max_ap", "value": 75.22925730473472}, {"type": "max_f1", "value": 69.60000000000001}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB TwitterURLCorpus", "type": "mteb/twitterurlcorpus-pairclassification", "config": "default", "split": 
"test", "revision": "8b6510b0b1fa4e4c4f879467980e9be563ec1cdf"}, "metrics": [{"type": "cos_sim_accuracy", "value": 89.34877944657896}, {"type": "cos_sim_ap", "value": 86.71257569277373}, {"type": "cos_sim_f1", "value": 79.10386355986088}, {"type": "cos_sim_precision", "value": 76.91468470434214}, {"type": "cos_sim_recall", "value": 81.4213119802895}, {"type": "dot_accuracy", "value": 89.34877944657896}, {"type": "dot_ap", "value": 86.71257133133368}, {"type": "dot_f1", "value": 79.10386355986088}, {"type": "dot_precision", "value": 76.91468470434214}, {"type": "dot_recall", "value": 81.4213119802895}, {"type": "euclidean_accuracy", "value": 89.34877944657896}, {"type": "euclidean_ap", "value": 86.71257651501476}, {"type": "euclidean_f1", "value": 79.10386355986088}, {"type": "euclidean_precision", "value": 76.91468470434214}, {"type": "euclidean_recall", "value": 81.4213119802895}, {"type": "manhattan_accuracy", "value": 89.35848177901967}, {"type": "manhattan_ap", "value": 86.69330615469126}, {"type": "manhattan_f1", "value": 79.13867741453949}, {"type": "manhattan_precision", "value": 76.78881807647741}, {"type": "manhattan_recall", "value": 81.63689559593472}, {"type": "max_accuracy", "value": 89.35848177901967}, {"type": "max_ap", "value": 86.71257651501476}, {"type": "max_f1", "value": 79.13867741453949}]}]}]} |
Muennighoff/SGPT-2.7B-weightedmean-msmarco-specb-bitfit | Muennighoff | sentence-similarity | [
"sentence-transformers",
"pytorch",
"gpt_neo",
"feature-extraction",
"sentence-similarity",
"mteb",
"arxiv:2202.08904",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
]
| 2022-03-02T23:29:04 | 2023-03-27T22:24:48 | 30 | 3 | ---
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- mteb
model-index:
- name: SGPT-2.7B-weightedmean-msmarco-specb-bitfit
results:
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (en)
type: mteb/amazon_counterfactual
config: en
split: test
revision: 2d8a100785abf0ae21420d2a55b0c56e3e1ea996
metrics:
- type: accuracy
value: 67.56716417910448
- type: ap
value: 30.75574629595259
- type: f1
value: 61.805121301858655
- task:
type: Classification
dataset:
name: MTEB AmazonPolarityClassification
type: mteb/amazon_polarity
config: default
split: test
revision: 80714f8dcf8cefc218ef4f8c5a966dd83f75a0e1
metrics:
- type: accuracy
value: 71.439575
- type: ap
value: 65.91341330532453
- type: f1
value: 70.90561852619555
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (en)
type: mteb/amazon_reviews_multi
config: en
split: test
revision: c379a6705fec24a2493fa68e011692605f44e119
metrics:
- type: accuracy
value: 35.748000000000005
- type: f1
value: 35.48576287186347
- task:
type: Retrieval
dataset:
name: MTEB ArguAna
type: arguana
config: default
split: test
revision: 5b3e3697907184a9b77a3c99ee9ea1a9cbb1e4e3
metrics:
- type: map_at_1
value: 25.96
- type: map_at_10
value: 41.619
- type: map_at_100
value: 42.673
- type: map_at_1000
value: 42.684
- type: map_at_3
value: 36.569
- type: map_at_5
value: 39.397
- type: mrr_at_1
value: 26.316
- type: mrr_at_10
value: 41.772
- type: mrr_at_100
value: 42.82
- type: mrr_at_1000
value: 42.83
- type: mrr_at_3
value: 36.724000000000004
- type: mrr_at_5
value: 39.528999999999996
- type: ndcg_at_1
value: 25.96
- type: ndcg_at_10
value: 50.491
- type: ndcg_at_100
value: 54.864999999999995
- type: ndcg_at_1000
value: 55.10699999999999
- type: ndcg_at_3
value: 40.053
- type: ndcg_at_5
value: 45.134
- type: precision_at_1
value: 25.96
- type: precision_at_10
value: 7.8950000000000005
- type: precision_at_100
value: 0.9780000000000001
- type: precision_at_1000
value: 0.1
- type: precision_at_3
value: 16.714000000000002
- type: precision_at_5
value: 12.489
- type: recall_at_1
value: 25.96
- type: recall_at_10
value: 78.947
- type: recall_at_100
value: 97.795
- type: recall_at_1000
value: 99.644
- type: recall_at_3
value: 50.141999999999996
- type: recall_at_5
value: 62.446999999999996
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringP2P
type: mteb/arxiv-clustering-p2p
config: default
split: test
revision: 0bbdb47bcbe3a90093699aefeed338a0f28a7ee8
metrics:
- type: v_measure
value: 44.72125714642202
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringS2S
type: mteb/arxiv-clustering-s2s
config: default
split: test
revision: b73bd54100e5abfa6e3a23dcafb46fe4d2438dc3
metrics:
- type: v_measure
value: 35.081451519142064
- task:
type: Reranking
dataset:
name: MTEB AskUbuntuDupQuestions
type: mteb/askubuntudupquestions-reranking
config: default
split: test
revision: 4d853f94cd57d85ec13805aeeac3ae3e5eb4c49c
metrics:
- type: map
value: 59.634661990392054
- type: mrr
value: 73.6813525040672
- task:
type: STS
dataset:
name: MTEB BIOSSES
type: mteb/biosses-sts
config: default
split: test
revision: 9ee918f184421b6bd48b78f6c714d86546106103
metrics:
- type: cos_sim_pearson
value: 87.42754550496836
- type: cos_sim_spearman
value: 84.84289705838664
- type: euclidean_pearson
value: 85.59331970450859
- type: euclidean_spearman
value: 85.8525586184271
- type: manhattan_pearson
value: 85.41233134466698
- type: manhattan_spearman
value: 85.52303303767404
- task:
type: Classification
dataset:
name: MTEB Banking77Classification
type: mteb/banking77
config: default
split: test
revision: 44fa15921b4c889113cc5df03dd4901b49161ab7
metrics:
- type: accuracy
value: 83.21753246753246
- type: f1
value: 83.15394543120915
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringP2P
type: mteb/biorxiv-clustering-p2p
config: default
split: test
revision: 11d0121201d1f1f280e8cc8f3d98fb9c4d9f9c55
metrics:
- type: v_measure
value: 34.41414219680629
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringS2S
type: mteb/biorxiv-clustering-s2s
config: default
split: test
revision: c0fab014e1bcb8d3a5e31b2088972a1e01547dc1
metrics:
- type: v_measure
value: 30.533275862270028
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackAndroidRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: 2b9f5791698b5be7bc5e10535c8690f20043c3db
metrics:
- type: map_at_1
value: 30.808999999999997
- type: map_at_10
value: 40.617
- type: map_at_100
value: 41.894999999999996
- type: map_at_1000
value: 42.025
- type: map_at_3
value: 37.0
- type: map_at_5
value: 38.993
- type: mrr_at_1
value: 37.482
- type: mrr_at_10
value: 46.497
- type: mrr_at_100
value: 47.144000000000005
- type: mrr_at_1000
value: 47.189
- type: mrr_at_3
value: 43.705
- type: mrr_at_5
value: 45.193
- type: ndcg_at_1
value: 37.482
- type: ndcg_at_10
value: 46.688
- type: ndcg_at_100
value: 51.726000000000006
- type: ndcg_at_1000
value: 53.825
- type: ndcg_at_3
value: 41.242000000000004
- type: ndcg_at_5
value: 43.657000000000004
- type: precision_at_1
value: 37.482
- type: precision_at_10
value: 8.827
- type: precision_at_100
value: 1.393
- type: precision_at_1000
value: 0.186
- type: precision_at_3
value: 19.361
- type: precision_at_5
value: 14.106
- type: recall_at_1
value: 30.808999999999997
- type: recall_at_10
value: 58.47
- type: recall_at_100
value: 80.51899999999999
- type: recall_at_1000
value: 93.809
- type: recall_at_3
value: 42.462
- type: recall_at_5
value: 49.385
- type: map_at_1
value: 26.962000000000003
- type: map_at_10
value: 36.93
- type: map_at_100
value: 38.102000000000004
- type: map_at_1000
value: 38.22
- type: map_at_3
value: 34.065
- type: map_at_5
value: 35.72
- type: mrr_at_1
value: 33.567
- type: mrr_at_10
value: 42.269
- type: mrr_at_100
value: 42.99
- type: mrr_at_1000
value: 43.033
- type: mrr_at_3
value: 40.064
- type: mrr_at_5
value: 41.258
- type: ndcg_at_1
value: 33.567
- type: ndcg_at_10
value: 42.405
- type: ndcg_at_100
value: 46.847
- type: ndcg_at_1000
value: 48.951
- type: ndcg_at_3
value: 38.312000000000005
- type: ndcg_at_5
value: 40.242
- type: precision_at_1
value: 33.567
- type: precision_at_10
value: 8.032
- type: precision_at_100
value: 1.295
- type: precision_at_1000
value: 0.17600000000000002
- type: precision_at_3
value: 18.662
- type: precision_at_5
value: 13.299
- type: recall_at_1
value: 26.962000000000003
- type: recall_at_10
value: 52.489
- type: recall_at_100
value: 71.635
- type: recall_at_1000
value: 85.141
- type: recall_at_3
value: 40.28
- type: recall_at_5
value: 45.757
- type: map_at_1
value: 36.318
- type: map_at_10
value: 47.97
- type: map_at_100
value: 49.003
- type: map_at_1000
value: 49.065999999999995
- type: map_at_3
value: 45.031
- type: map_at_5
value: 46.633
- type: mrr_at_1
value: 41.504999999999995
- type: mrr_at_10
value: 51.431000000000004
- type: mrr_at_100
value: 52.129000000000005
- type: mrr_at_1000
value: 52.161
- type: mrr_at_3
value: 48.934
- type: mrr_at_5
value: 50.42
- type: ndcg_at_1
value: 41.504999999999995
- type: ndcg_at_10
value: 53.676
- type: ndcg_at_100
value: 57.867000000000004
- type: ndcg_at_1000
value: 59.166
- type: ndcg_at_3
value: 48.516
- type: ndcg_at_5
value: 50.983999999999995
- type: precision_at_1
value: 41.504999999999995
- type: precision_at_10
value: 8.608
- type: precision_at_100
value: 1.1560000000000001
- type: precision_at_1000
value: 0.133
- type: precision_at_3
value: 21.462999999999997
- type: precision_at_5
value: 14.721
- type: recall_at_1
value: 36.318
- type: recall_at_10
value: 67.066
- type: recall_at_100
value: 85.34
- type: recall_at_1000
value: 94.491
- type: recall_at_3
value: 53.215999999999994
- type: recall_at_5
value: 59.214
- type: map_at_1
value: 22.167
- type: map_at_10
value: 29.543999999999997
- type: map_at_100
value: 30.579
- type: map_at_1000
value: 30.669999999999998
- type: map_at_3
value: 26.982
- type: map_at_5
value: 28.474
- type: mrr_at_1
value: 24.068
- type: mrr_at_10
value: 31.237
- type: mrr_at_100
value: 32.222
- type: mrr_at_1000
value: 32.292
- type: mrr_at_3
value: 28.776000000000003
- type: mrr_at_5
value: 30.233999999999998
- type: ndcg_at_1
value: 24.068
- type: ndcg_at_10
value: 33.973
- type: ndcg_at_100
value: 39.135
- type: ndcg_at_1000
value: 41.443999999999996
- type: ndcg_at_3
value: 29.018
- type: ndcg_at_5
value: 31.558999999999997
- type: precision_at_1
value: 24.068
- type: precision_at_10
value: 5.299
- type: precision_at_100
value: 0.823
- type: precision_at_1000
value: 0.106
- type: precision_at_3
value: 12.166
- type: precision_at_5
value: 8.767999999999999
- type: recall_at_1
value: 22.167
- type: recall_at_10
value: 46.115
- type: recall_at_100
value: 69.867
- type: recall_at_1000
value: 87.234
- type: recall_at_3
value: 32.798
- type: recall_at_5
value: 38.951
- type: map_at_1
value: 12.033000000000001
- type: map_at_10
value: 19.314
- type: map_at_100
value: 20.562
- type: map_at_1000
value: 20.695
- type: map_at_3
value: 16.946
- type: map_at_5
value: 18.076999999999998
- type: mrr_at_1
value: 14.801
- type: mrr_at_10
value: 22.74
- type: mrr_at_100
value: 23.876
- type: mrr_at_1000
value: 23.949
- type: mrr_at_3
value: 20.211000000000002
- type: mrr_at_5
value: 21.573
- type: ndcg_at_1
value: 14.801
- type: ndcg_at_10
value: 24.038
- type: ndcg_at_100
value: 30.186
- type: ndcg_at_1000
value: 33.321
- type: ndcg_at_3
value: 19.431
- type: ndcg_at_5
value: 21.34
- type: precision_at_1
value: 14.801
- type: precision_at_10
value: 4.776
- type: precision_at_100
value: 0.897
- type: precision_at_1000
value: 0.133
- type: precision_at_3
value: 9.66
- type: precision_at_5
value: 7.239
- type: recall_at_1
value: 12.033000000000001
- type: recall_at_10
value: 35.098
- type: recall_at_100
value: 62.175000000000004
- type: recall_at_1000
value: 84.17099999999999
- type: recall_at_3
value: 22.61
- type: recall_at_5
value: 27.278999999999996
- type: map_at_1
value: 26.651000000000003
- type: map_at_10
value: 36.901
- type: map_at_100
value: 38.249
- type: map_at_1000
value: 38.361000000000004
- type: map_at_3
value: 33.891
- type: map_at_5
value: 35.439
- type: mrr_at_1
value: 32.724
- type: mrr_at_10
value: 42.504
- type: mrr_at_100
value: 43.391999999999996
- type: mrr_at_1000
value: 43.436
- type: mrr_at_3
value: 39.989999999999995
- type: mrr_at_5
value: 41.347
- type: ndcg_at_1
value: 32.724
- type: ndcg_at_10
value: 43.007
- type: ndcg_at_100
value: 48.601
- type: ndcg_at_1000
value: 50.697
- type: ndcg_at_3
value: 37.99
- type: ndcg_at_5
value: 40.083999999999996
- type: precision_at_1
value: 32.724
- type: precision_at_10
value: 7.872999999999999
- type: precision_at_100
value: 1.247
- type: precision_at_1000
value: 0.16199999999999998
- type: precision_at_3
value: 18.062
- type: precision_at_5
value: 12.666
- type: recall_at_1
value: 26.651000000000003
- type: recall_at_10
value: 55.674
- type: recall_at_100
value: 78.904
- type: recall_at_1000
value: 92.55799999999999
- type: recall_at_3
value: 41.36
- type: recall_at_5
value: 46.983999999999995
- type: map_at_1
value: 22.589000000000002
- type: map_at_10
value: 32.244
- type: map_at_100
value: 33.46
- type: map_at_1000
value: 33.593
- type: map_at_3
value: 29.21
- type: map_at_5
value: 31.019999999999996
- type: mrr_at_1
value: 28.425
- type: mrr_at_10
value: 37.282
- type: mrr_at_100
value: 38.187
- type: mrr_at_1000
value: 38.248
- type: mrr_at_3
value: 34.684
- type: mrr_at_5
value: 36.123
- type: ndcg_at_1
value: 28.425
- type: ndcg_at_10
value: 37.942
- type: ndcg_at_100
value: 43.443
- type: ndcg_at_1000
value: 45.995999999999995
- type: ndcg_at_3
value: 32.873999999999995
- type: ndcg_at_5
value: 35.325
- type: precision_at_1
value: 28.425
- type: precision_at_10
value: 7.1
- type: precision_at_100
value: 1.166
- type: precision_at_1000
value: 0.158
- type: precision_at_3
value: 16.02
- type: precision_at_5
value: 11.644
- type: recall_at_1
value: 22.589000000000002
- type: recall_at_10
value: 50.03999999999999
- type: recall_at_100
value: 73.973
- type: recall_at_1000
value: 91.128
- type: recall_at_3
value: 35.882999999999996
- type: recall_at_5
value: 42.187999999999995
- type: map_at_1
value: 23.190833333333334
- type: map_at_10
value: 31.504916666666666
- type: map_at_100
value: 32.64908333333334
- type: map_at_1000
value: 32.77075
- type: map_at_3
value: 28.82575
- type: map_at_5
value: 30.2755
- type: mrr_at_1
value: 27.427499999999995
- type: mrr_at_10
value: 35.36483333333334
- type: mrr_at_100
value: 36.23441666666666
- type: mrr_at_1000
value: 36.297583333333336
- type: mrr_at_3
value: 32.97966666666667
- type: mrr_at_5
value: 34.294583333333335
- type: ndcg_at_1
value: 27.427499999999995
- type: ndcg_at_10
value: 36.53358333333333
- type: ndcg_at_100
value: 41.64508333333333
- type: ndcg_at_1000
value: 44.14499999999999
- type: ndcg_at_3
value: 31.88908333333333
- type: ndcg_at_5
value: 33.98433333333333
- type: precision_at_1
value: 27.427499999999995
- type: precision_at_10
value: 6.481083333333333
- type: precision_at_100
value: 1.0610833333333334
- type: precision_at_1000
value: 0.14691666666666667
- type: precision_at_3
value: 14.656749999999999
- type: precision_at_5
value: 10.493583333333332
- type: recall_at_1
value: 23.190833333333334
- type: recall_at_10
value: 47.65175
- type: recall_at_100
value: 70.41016666666667
- type: recall_at_1000
value: 87.82708333333332
- type: recall_at_3
value: 34.637583333333325
- type: recall_at_5
value: 40.05008333333333
- type: map_at_1
value: 20.409
- type: map_at_10
value: 26.794
- type: map_at_100
value: 27.682000000000002
- type: map_at_1000
value: 27.783
- type: map_at_3
value: 24.461
- type: map_at_5
value: 25.668000000000003
- type: mrr_at_1
value: 22.853
- type: mrr_at_10
value: 29.296
- type: mrr_at_100
value: 30.103
- type: mrr_at_1000
value: 30.179000000000002
- type: mrr_at_3
value: 27.173000000000002
- type: mrr_at_5
value: 28.223
- type: ndcg_at_1
value: 22.853
- type: ndcg_at_10
value: 31.007
- type: ndcg_at_100
value: 35.581
- type: ndcg_at_1000
value: 38.147
- type: ndcg_at_3
value: 26.590999999999998
- type: ndcg_at_5
value: 28.43
- type: precision_at_1
value: 22.853
- type: precision_at_10
value: 5.031
- type: precision_at_100
value: 0.7939999999999999
- type: precision_at_1000
value: 0.11
- type: precision_at_3
value: 11.401
- type: precision_at_5
value: 8.16
- type: recall_at_1
value: 20.409
- type: recall_at_10
value: 41.766
- type: recall_at_100
value: 62.964
- type: recall_at_1000
value: 81.682
- type: recall_at_3
value: 29.281000000000002
- type: recall_at_5
value: 33.83
- type: map_at_1
value: 14.549000000000001
- type: map_at_10
value: 20.315
- type: map_at_100
value: 21.301000000000002
- type: map_at_1000
value: 21.425
- type: map_at_3
value: 18.132
- type: map_at_5
value: 19.429
- type: mrr_at_1
value: 17.86
- type: mrr_at_10
value: 23.860999999999997
- type: mrr_at_100
value: 24.737000000000002
- type: mrr_at_1000
value: 24.82
- type: mrr_at_3
value: 21.685
- type: mrr_at_5
value: 23.008
- type: ndcg_at_1
value: 17.86
- type: ndcg_at_10
value: 24.396
- type: ndcg_at_100
value: 29.328
- type: ndcg_at_1000
value: 32.486
- type: ndcg_at_3
value: 20.375
- type: ndcg_at_5
value: 22.411
- type: precision_at_1
value: 17.86
- type: precision_at_10
value: 4.47
- type: precision_at_100
value: 0.8099999999999999
- type: precision_at_1000
value: 0.125
- type: precision_at_3
value: 9.475
- type: precision_at_5
value: 7.170999999999999
- type: recall_at_1
value: 14.549000000000001
- type: recall_at_10
value: 33.365
- type: recall_at_100
value: 55.797
- type: recall_at_1000
value: 78.632
- type: recall_at_3
value: 22.229
- type: recall_at_5
value: 27.339000000000002
- type: map_at_1
value: 23.286
- type: map_at_10
value: 30.728
- type: map_at_100
value: 31.840000000000003
- type: map_at_1000
value: 31.953
- type: map_at_3
value: 28.302
- type: map_at_5
value: 29.615000000000002
- type: mrr_at_1
value: 27.239
- type: mrr_at_10
value: 34.408
- type: mrr_at_100
value: 35.335
- type: mrr_at_1000
value: 35.405
- type: mrr_at_3
value: 32.151999999999994
- type: mrr_at_5
value: 33.355000000000004
- type: ndcg_at_1
value: 27.239
- type: ndcg_at_10
value: 35.324
- type: ndcg_at_100
value: 40.866
- type: ndcg_at_1000
value: 43.584
- type: ndcg_at_3
value: 30.898999999999997
- type: ndcg_at_5
value: 32.812999999999995
- type: precision_at_1
value: 27.239
- type: precision_at_10
value: 5.896
- type: precision_at_100
value: 0.979
- type: precision_at_1000
value: 0.133
- type: precision_at_3
value: 13.713000000000001
- type: precision_at_5
value: 9.683
- type: recall_at_1
value: 23.286
- type: recall_at_10
value: 45.711
- type: recall_at_100
value: 70.611
- type: recall_at_1000
value: 90.029
- type: recall_at_3
value: 33.615
- type: recall_at_5
value: 38.41
- type: map_at_1
value: 23.962
- type: map_at_10
value: 31.942999999999998
- type: map_at_100
value: 33.384
- type: map_at_1000
value: 33.611000000000004
- type: map_at_3
value: 29.243000000000002
- type: map_at_5
value: 30.446
- type: mrr_at_1
value: 28.458
- type: mrr_at_10
value: 36.157000000000004
- type: mrr_at_100
value: 37.092999999999996
- type: mrr_at_1000
value: 37.163000000000004
- type: mrr_at_3
value: 33.86
- type: mrr_at_5
value: 35.086
- type: ndcg_at_1
value: 28.458
- type: ndcg_at_10
value: 37.201
- type: ndcg_at_100
value: 42.591
- type: ndcg_at_1000
value: 45.539
- type: ndcg_at_3
value: 32.889
- type: ndcg_at_5
value: 34.483000000000004
- type: precision_at_1
value: 28.458
- type: precision_at_10
value: 7.332
- type: precision_at_100
value: 1.437
- type: precision_at_1000
value: 0.233
- type: precision_at_3
value: 15.547
- type: precision_at_5
value: 11.146
- type: recall_at_1
value: 23.962
- type: recall_at_10
value: 46.751
- type: recall_at_100
value: 71.626
- type: recall_at_1000
value: 90.93900000000001
- type: recall_at_3
value: 34.138000000000005
- type: recall_at_5
value: 38.673
- type: map_at_1
value: 18.555
- type: map_at_10
value: 24.759
- type: map_at_100
value: 25.732
- type: map_at_1000
value: 25.846999999999998
- type: map_at_3
value: 22.646
- type: map_at_5
value: 23.791999999999998
- type: mrr_at_1
value: 20.148
- type: mrr_at_10
value: 26.695999999999998
- type: mrr_at_100
value: 27.605
- type: mrr_at_1000
value: 27.695999999999998
- type: mrr_at_3
value: 24.522
- type: mrr_at_5
value: 25.715
- type: ndcg_at_1
value: 20.148
- type: ndcg_at_10
value: 28.746
- type: ndcg_at_100
value: 33.57
- type: ndcg_at_1000
value: 36.584
- type: ndcg_at_3
value: 24.532
- type: ndcg_at_5
value: 26.484
- type: precision_at_1
value: 20.148
- type: precision_at_10
value: 4.529
- type: precision_at_100
value: 0.736
- type: precision_at_1000
value: 0.108
- type: precision_at_3
value: 10.351
- type: precision_at_5
value: 7.32
- type: recall_at_1
value: 18.555
- type: recall_at_10
value: 39.275999999999996
- type: recall_at_100
value: 61.511
- type: recall_at_1000
value: 84.111
- type: recall_at_3
value: 27.778999999999996
- type: recall_at_5
value: 32.591
- task:
type: Retrieval
dataset:
name: MTEB ClimateFEVER
type: climate-fever
config: default
split: test
revision: 392b78eb68c07badcd7c2cd8f39af108375dfcce
metrics:
- type: map_at_1
value: 10.366999999999999
- type: map_at_10
value: 18.953999999999997
- type: map_at_100
value: 20.674999999999997
- type: map_at_1000
value: 20.868000000000002
- type: map_at_3
value: 15.486
- type: map_at_5
value: 17.347
- type: mrr_at_1
value: 23.257
- type: mrr_at_10
value: 35.419
- type: mrr_at_100
value: 36.361
- type: mrr_at_1000
value: 36.403
- type: mrr_at_3
value: 31.747999999999998
- type: mrr_at_5
value: 34.077
- type: ndcg_at_1
value: 23.257
- type: ndcg_at_10
value: 27.11
- type: ndcg_at_100
value: 33.981
- type: ndcg_at_1000
value: 37.444
- type: ndcg_at_3
value: 21.471999999999998
- type: ndcg_at_5
value: 23.769000000000002
- type: precision_at_1
value: 23.257
- type: precision_at_10
value: 8.704
- type: precision_at_100
value: 1.606
- type: precision_at_1000
value: 0.22499999999999998
- type: precision_at_3
value: 16.287
- type: precision_at_5
value: 13.068
- type: recall_at_1
value: 10.366999999999999
- type: recall_at_10
value: 33.706
- type: recall_at_100
value: 57.375
- type: recall_at_1000
value: 76.79
- type: recall_at_3
value: 20.18
- type: recall_at_5
value: 26.215
- task:
type: Retrieval
dataset:
name: MTEB DBPedia
type: dbpedia-entity
config: default
split: test
revision: f097057d03ed98220bc7309ddb10b71a54d667d6
metrics:
- type: map_at_1
value: 8.246
- type: map_at_10
value: 15.979
- type: map_at_100
value: 21.025
- type: map_at_1000
value: 22.189999999999998
- type: map_at_3
value: 11.997
- type: map_at_5
value: 13.697000000000001
- type: mrr_at_1
value: 60.75000000000001
- type: mrr_at_10
value: 68.70100000000001
- type: mrr_at_100
value: 69.1
- type: mrr_at_1000
value: 69.111
- type: mrr_at_3
value: 66.583
- type: mrr_at_5
value: 67.87100000000001
- type: ndcg_at_1
value: 49.75
- type: ndcg_at_10
value: 34.702
- type: ndcg_at_100
value: 37.607
- type: ndcg_at_1000
value: 44.322
- type: ndcg_at_3
value: 39.555
- type: ndcg_at_5
value: 36.684
- type: precision_at_1
value: 60.75000000000001
- type: precision_at_10
value: 26.625
- type: precision_at_100
value: 7.969999999999999
- type: precision_at_1000
value: 1.678
- type: precision_at_3
value: 41.833
- type: precision_at_5
value: 34.5
- type: recall_at_1
value: 8.246
- type: recall_at_10
value: 20.968
- type: recall_at_100
value: 42.065000000000005
- type: recall_at_1000
value: 63.671
- type: recall_at_3
value: 13.039000000000001
- type: recall_at_5
value: 16.042
- task:
type: Classification
dataset:
name: MTEB EmotionClassification
type: mteb/emotion
config: default
split: test
revision: 829147f8f75a25f005913200eb5ed41fae320aa1
metrics:
- type: accuracy
value: 49.214999999999996
- type: f1
value: 44.85952451163755
- task:
type: Retrieval
dataset:
name: MTEB FEVER
type: fever
config: default
split: test
revision: 1429cf27e393599b8b359b9b72c666f96b2525f9
metrics:
- type: map_at_1
value: 56.769000000000005
- type: map_at_10
value: 67.30199999999999
- type: map_at_100
value: 67.692
- type: map_at_1000
value: 67.712
- type: map_at_3
value: 65.346
- type: map_at_5
value: 66.574
- type: mrr_at_1
value: 61.370999999999995
- type: mrr_at_10
value: 71.875
- type: mrr_at_100
value: 72.195
- type: mrr_at_1000
value: 72.206
- type: mrr_at_3
value: 70.04
- type: mrr_at_5
value: 71.224
- type: ndcg_at_1
value: 61.370999999999995
- type: ndcg_at_10
value: 72.731
- type: ndcg_at_100
value: 74.468
- type: ndcg_at_1000
value: 74.91600000000001
- type: ndcg_at_3
value: 69.077
- type: ndcg_at_5
value: 71.111
- type: precision_at_1
value: 61.370999999999995
- type: precision_at_10
value: 9.325999999999999
- type: precision_at_100
value: 1.03
- type: precision_at_1000
value: 0.108
- type: precision_at_3
value: 27.303
- type: precision_at_5
value: 17.525
- type: recall_at_1
value: 56.769000000000005
- type: recall_at_10
value: 85.06
- type: recall_at_100
value: 92.767
- type: recall_at_1000
value: 95.933
- type: recall_at_3
value: 75.131
- type: recall_at_5
value: 80.17
- task:
type: Retrieval
dataset:
name: MTEB FiQA2018
type: fiqa
config: default
split: test
revision: 41b686a7f28c59bcaaa5791efd47c67c8ebe28be
metrics:
- type: map_at_1
value: 15.753
- type: map_at_10
value: 25.875999999999998
- type: map_at_100
value: 27.415
- type: map_at_1000
value: 27.590999999999998
- type: map_at_3
value: 22.17
- type: map_at_5
value: 24.236
- type: mrr_at_1
value: 31.019000000000002
- type: mrr_at_10
value: 39.977000000000004
- type: mrr_at_100
value: 40.788999999999994
- type: mrr_at_1000
value: 40.832
- type: mrr_at_3
value: 37.088
- type: mrr_at_5
value: 38.655
- type: ndcg_at_1
value: 31.019000000000002
- type: ndcg_at_10
value: 33.286
- type: ndcg_at_100
value: 39.528999999999996
- type: ndcg_at_1000
value: 42.934
- type: ndcg_at_3
value: 29.29
- type: ndcg_at_5
value: 30.615
- type: precision_at_1
value: 31.019000000000002
- type: precision_at_10
value: 9.383
- type: precision_at_100
value: 1.6019999999999999
- type: precision_at_1000
value: 0.22200000000000003
- type: precision_at_3
value: 19.753
- type: precision_at_5
value: 14.815000000000001
- type: recall_at_1
value: 15.753
- type: recall_at_10
value: 40.896
- type: recall_at_100
value: 64.443
- type: recall_at_1000
value: 85.218
- type: recall_at_3
value: 26.526
- type: recall_at_5
value: 32.452999999999996
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA
type: hotpotqa
config: default
split: test
revision: 766870b35a1b9ca65e67a0d1913899973551fc6c
metrics:
- type: map_at_1
value: 32.153999999999996
- type: map_at_10
value: 43.651
- type: map_at_100
value: 44.41
- type: map_at_1000
value: 44.487
- type: map_at_3
value: 41.239
- type: map_at_5
value: 42.659000000000006
- type: mrr_at_1
value: 64.30799999999999
- type: mrr_at_10
value: 71.22500000000001
- type: mrr_at_100
value: 71.57
- type: mrr_at_1000
value: 71.59100000000001
- type: mrr_at_3
value: 69.95
- type: mrr_at_5
value: 70.738
- type: ndcg_at_1
value: 64.30799999999999
- type: ndcg_at_10
value: 52.835
- type: ndcg_at_100
value: 55.840999999999994
- type: ndcg_at_1000
value: 57.484
- type: ndcg_at_3
value: 49.014
- type: ndcg_at_5
value: 51.01599999999999
- type: precision_at_1
value: 64.30799999999999
- type: precision_at_10
value: 10.77
- type: precision_at_100
value: 1.315
- type: precision_at_1000
value: 0.153
- type: precision_at_3
value: 30.223
- type: precision_at_5
value: 19.716
- type: recall_at_1
value: 32.153999999999996
- type: recall_at_10
value: 53.849000000000004
- type: recall_at_100
value: 65.75999999999999
- type: recall_at_1000
value: 76.705
- type: recall_at_3
value: 45.334
- type: recall_at_5
value: 49.291000000000004
- task:
type: Classification
dataset:
name: MTEB ImdbClassification
type: mteb/imdb
config: default
split: test
revision: 8d743909f834c38949e8323a8a6ce8721ea6c7f4
metrics:
- type: accuracy
value: 63.5316
- type: ap
value: 58.90084300359825
- type: f1
value: 63.35727889030892
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO
type: msmarco
config: default
split: validation
revision: e6838a846e2408f22cf5cc337ebc83e0bcf77849
metrics:
- type: map_at_1
value: 20.566000000000003
- type: map_at_10
value: 32.229
- type: map_at_100
value: 33.445
- type: map_at_1000
value: 33.501
- type: map_at_3
value: 28.504
- type: map_at_5
value: 30.681000000000004
- type: mrr_at_1
value: 21.218
- type: mrr_at_10
value: 32.816
- type: mrr_at_100
value: 33.986
- type: mrr_at_1000
value: 34.035
- type: mrr_at_3
value: 29.15
- type: mrr_at_5
value: 31.290000000000003
- type: ndcg_at_1
value: 21.218
- type: ndcg_at_10
value: 38.832
- type: ndcg_at_100
value: 44.743
- type: ndcg_at_1000
value: 46.138
- type: ndcg_at_3
value: 31.232
- type: ndcg_at_5
value: 35.099999999999994
- type: precision_at_1
value: 21.218
- type: precision_at_10
value: 6.186
- type: precision_at_100
value: 0.914
- type: precision_at_1000
value: 0.10300000000000001
- type: precision_at_3
value: 13.314
- type: precision_at_5
value: 9.943
- type: recall_at_1
value: 20.566000000000003
- type: recall_at_10
value: 59.192
- type: recall_at_100
value: 86.626
- type: recall_at_1000
value: 97.283
- type: recall_at_3
value: 38.492
- type: recall_at_5
value: 47.760000000000005
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (en)
type: mteb/mtop_domain
config: en
split: test
revision: a7e2a951126a26fc8c6a69f835f33a346ba259e3
metrics:
- type: accuracy
value: 92.56269949840402
- type: f1
value: 92.1020975473988
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (en)
type: mteb/mtop_intent
config: en
split: test
revision: 6299947a7777084cc2d4b64235bf7190381ce755
metrics:
- type: accuracy
value: 71.8467852257182
- type: f1
value: 53.652719348592015
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (en)
type: mteb/amazon_massive_intent
config: en
split: test
revision: 072a486a144adf7f4479a4a0dddb2152e161e1ea
metrics:
- type: accuracy
value: 69.00806993947546
- type: f1
value: 67.41429618885515
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (en)
type: mteb/amazon_massive_scenario
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 75.90114324142569
- type: f1
value: 76.25183590651454
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringP2P
type: mteb/medrxiv-clustering-p2p
config: default
split: test
revision: dcefc037ef84348e49b0d29109e891c01067226b
metrics:
- type: v_measure
value: 31.350109978273395
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringS2S
type: mteb/medrxiv-clustering-s2s
config: default
split: test
revision: 3cd0e71dfbe09d4de0f9e5ecba43e7ce280959dc
metrics:
- type: v_measure
value: 28.768923695767327
- task:
type: Reranking
dataset:
name: MTEB MindSmallReranking
type: mteb/mind_small
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 31.716396735210754
- type: mrr
value: 32.88970538547634
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus
type: nfcorpus
config: default
split: test
revision: 7eb63cc0c1eb59324d709ebed25fcab851fa7610
metrics:
- type: map_at_1
value: 5.604
- type: map_at_10
value: 12.379999999999999
- type: map_at_100
value: 15.791
- type: map_at_1000
value: 17.327
- type: map_at_3
value: 9.15
- type: map_at_5
value: 10.599
- type: mrr_at_1
value: 45.201
- type: mrr_at_10
value: 53.374
- type: mrr_at_100
value: 54.089
- type: mrr_at_1000
value: 54.123
- type: mrr_at_3
value: 51.44499999999999
- type: mrr_at_5
value: 52.59
- type: ndcg_at_1
value: 42.879
- type: ndcg_at_10
value: 33.891
- type: ndcg_at_100
value: 31.391999999999996
- type: ndcg_at_1000
value: 40.36
- type: ndcg_at_3
value: 39.076
- type: ndcg_at_5
value: 37.047000000000004
- type: precision_at_1
value: 44.582
- type: precision_at_10
value: 25.294
- type: precision_at_100
value: 8.285
- type: precision_at_1000
value: 2.1479999999999997
- type: precision_at_3
value: 36.120000000000005
- type: precision_at_5
value: 31.95
- type: recall_at_1
value: 5.604
- type: recall_at_10
value: 16.239
- type: recall_at_100
value: 32.16
- type: recall_at_1000
value: 64.513
- type: recall_at_3
value: 10.406
- type: recall_at_5
value: 12.684999999999999
- task:
type: Retrieval
dataset:
name: MTEB NQ
type: nq
config: default
split: test
revision: 6062aefc120bfe8ece5897809fb2e53bfe0d128c
metrics:
- type: map_at_1
value: 25.881
- type: map_at_10
value: 39.501
- type: map_at_100
value: 40.615
- type: map_at_1000
value: 40.661
- type: map_at_3
value: 35.559000000000005
- type: map_at_5
value: 37.773
- type: mrr_at_1
value: 29.229
- type: mrr_at_10
value: 41.955999999999996
- type: mrr_at_100
value: 42.86
- type: mrr_at_1000
value: 42.893
- type: mrr_at_3
value: 38.562000000000005
- type: mrr_at_5
value: 40.542
- type: ndcg_at_1
value: 29.2
- type: ndcg_at_10
value: 46.703
- type: ndcg_at_100
value: 51.644
- type: ndcg_at_1000
value: 52.771
- type: ndcg_at_3
value: 39.141999999999996
- type: ndcg_at_5
value: 42.892
- type: precision_at_1
value: 29.2
- type: precision_at_10
value: 7.920000000000001
- type: precision_at_100
value: 1.0659999999999998
- type: precision_at_1000
value: 0.117
- type: precision_at_3
value: 18.105
- type: precision_at_5
value: 13.036
- type: recall_at_1
value: 25.881
- type: recall_at_10
value: 66.266
- type: recall_at_100
value: 88.116
- type: recall_at_1000
value: 96.58200000000001
- type: recall_at_3
value: 46.526
- type: recall_at_5
value: 55.154
- task:
type: Retrieval
dataset:
name: MTEB QuoraRetrieval
type: quora
config: default
split: test
revision: 6205996560df11e3a3da9ab4f926788fc30a7db4
metrics:
- type: map_at_1
value: 67.553
- type: map_at_10
value: 81.34
- type: map_at_100
value: 82.002
- type: map_at_1000
value: 82.027
- type: map_at_3
value: 78.281
- type: map_at_5
value: 80.149
- type: mrr_at_1
value: 77.72
- type: mrr_at_10
value: 84.733
- type: mrr_at_100
value: 84.878
- type: mrr_at_1000
value: 84.879
- type: mrr_at_3
value: 83.587
- type: mrr_at_5
value: 84.32600000000001
- type: ndcg_at_1
value: 77.75
- type: ndcg_at_10
value: 85.603
- type: ndcg_at_100
value: 87.069
- type: ndcg_at_1000
value: 87.25
- type: ndcg_at_3
value: 82.303
- type: ndcg_at_5
value: 84.03699999999999
- type: precision_at_1
value: 77.75
- type: precision_at_10
value: 13.04
- type: precision_at_100
value: 1.5070000000000001
- type: precision_at_1000
value: 0.156
- type: precision_at_3
value: 35.903
- type: precision_at_5
value: 23.738
- type: recall_at_1
value: 67.553
- type: recall_at_10
value: 93.903
- type: recall_at_100
value: 99.062
- type: recall_at_1000
value: 99.935
- type: recall_at_3
value: 84.58099999999999
- type: recall_at_5
value: 89.316
- task:
type: Clustering
dataset:
name: MTEB RedditClustering
type: mteb/reddit-clustering
config: default
split: test
revision: b2805658ae38990172679479369a78b86de8c390
metrics:
- type: v_measure
value: 46.46887711230235
- task:
type: Clustering
dataset:
name: MTEB RedditClusteringP2P
type: mteb/reddit-clustering-p2p
config: default
split: test
revision: 385e3cb46b4cfa89021f56c4380204149d0efe33
metrics:
- type: v_measure
value: 54.166876298246926
- task:
type: Retrieval
dataset:
name: MTEB SCIDOCS
type: scidocs
config: default
split: test
revision: 5c59ef3e437a0a9651c8fe6fde943e7dce59fba5
metrics:
- type: map_at_1
value: 4.053
- type: map_at_10
value: 9.693999999999999
- type: map_at_100
value: 11.387
- type: map_at_1000
value: 11.654
- type: map_at_3
value: 7.053
- type: map_at_5
value: 8.439
- type: mrr_at_1
value: 19.900000000000002
- type: mrr_at_10
value: 29.359
- type: mrr_at_100
value: 30.484
- type: mrr_at_1000
value: 30.553
- type: mrr_at_3
value: 26.200000000000003
- type: mrr_at_5
value: 28.115000000000002
- type: ndcg_at_1
value: 19.900000000000002
- type: ndcg_at_10
value: 16.575
- type: ndcg_at_100
value: 23.655
- type: ndcg_at_1000
value: 28.853
- type: ndcg_at_3
value: 15.848
- type: ndcg_at_5
value: 14.026
- type: precision_at_1
value: 19.900000000000002
- type: precision_at_10
value: 8.450000000000001
- type: precision_at_100
value: 1.872
- type: precision_at_1000
value: 0.313
- type: precision_at_3
value: 14.667
- type: precision_at_5
value: 12.32
- type: recall_at_1
value: 4.053
- type: recall_at_10
value: 17.169999999999998
- type: recall_at_100
value: 38.025
- type: recall_at_1000
value: 63.571999999999996
- type: recall_at_3
value: 8.903
- type: recall_at_5
value: 12.477
- task:
type: STS
dataset:
name: MTEB SICK-R
type: mteb/sickr-sts
config: default
split: test
revision: 20a6d6f312dd54037fe07a32d58e5e168867909d
metrics:
- type: cos_sim_pearson
value: 77.7548748519677
- type: cos_sim_spearman
value: 68.19926431966059
- type: euclidean_pearson
value: 71.69016204991725
- type: euclidean_spearman
value: 66.98099673026834
- type: manhattan_pearson
value: 71.62994072488664
- type: manhattan_spearman
value: 67.03435950744577
- task:
type: STS
dataset:
name: MTEB STS12
type: mteb/sts12-sts
config: default
split: test
revision: fdf84275bb8ce4b49c971d02e84dd1abc677a50f
metrics:
- type: cos_sim_pearson
value: 75.91051402657887
- type: cos_sim_spearman
value: 66.99390786191645
- type: euclidean_pearson
value: 71.54128036454578
- type: euclidean_spearman
value: 69.25605675649068
- type: manhattan_pearson
value: 71.60981030780171
- type: manhattan_spearman
value: 69.27513670128046
- task:
type: STS
dataset:
name: MTEB STS13
type: mteb/sts13-sts
config: default
split: test
revision: 1591bfcbe8c69d4bf7fe2a16e2451017832cafb9
metrics:
- type: cos_sim_pearson
value: 77.23835466417793
- type: cos_sim_spearman
value: 77.57623085766706
- type: euclidean_pearson
value: 77.5090992200725
- type: euclidean_spearman
value: 77.88601688144924
- type: manhattan_pearson
value: 77.39045060647423
- type: manhattan_spearman
value: 77.77552718279098
- task:
type: STS
dataset:
name: MTEB STS14
type: mteb/sts14-sts
config: default
split: test
revision: e2125984e7df8b7871f6ae9949cf6b6795e7c54b
metrics:
- type: cos_sim_pearson
value: 77.91692485139602
- type: cos_sim_spearman
value: 72.78258293483495
- type: euclidean_pearson
value: 74.64773017077789
- type: euclidean_spearman
value: 71.81662299104619
- type: manhattan_pearson
value: 74.71043337995533
- type: manhattan_spearman
value: 71.83960860845646
- task:
type: STS
dataset:
name: MTEB STS15
type: mteb/sts15-sts
config: default
split: test
revision: 1cd7298cac12a96a373b6a2f18738bb3e739a9b6
metrics:
- type: cos_sim_pearson
value: 82.13422113617578
- type: cos_sim_spearman
value: 82.61707296911949
- type: euclidean_pearson
value: 81.42487480400861
- type: euclidean_spearman
value: 82.17970991273835
- type: manhattan_pearson
value: 81.41985055477845
- type: manhattan_spearman
value: 82.15823204362937
- task:
type: STS
dataset:
name: MTEB STS16
type: mteb/sts16-sts
config: default
split: test
revision: 360a0b2dff98700d09e634a01e1cc1624d3e42cd
metrics:
- type: cos_sim_pearson
value: 79.07989542843826
- type: cos_sim_spearman
value: 80.09839524406284
- type: euclidean_pearson
value: 76.43186028364195
- type: euclidean_spearman
value: 76.76720323266471
- type: manhattan_pearson
value: 76.4674747409161
- type: manhattan_spearman
value: 76.81797407068667
- task:
type: STS
dataset:
name: MTEB STS17 (en-en)
type: mteb/sts17-crosslingual-sts
config: en-en
split: test
revision: 9fc37e8c632af1c87a3d23e685d49552a02582a0
metrics:
- type: cos_sim_pearson
value: 87.0420983224933
- type: cos_sim_spearman
value: 87.25017540413702
- type: euclidean_pearson
value: 84.56384596473421
- type: euclidean_spearman
value: 84.72557417564886
- type: manhattan_pearson
value: 84.7329954474549
- type: manhattan_spearman
value: 84.75071371008909
- task:
type: STS
dataset:
name: MTEB STS22 (en)
type: mteb/sts22-crosslingual-sts
config: en
split: test
revision: 2de6ce8c1921b71a755b262c6b57fef195dd7906
metrics:
- type: cos_sim_pearson
value: 68.47031320016424
- type: cos_sim_spearman
value: 68.7486910762485
- type: euclidean_pearson
value: 71.30330985913915
- type: euclidean_spearman
value: 71.59666258520735
- type: manhattan_pearson
value: 71.4423884279027
- type: manhattan_spearman
value: 71.67460706861044
- task:
type: STS
dataset:
name: MTEB STSBenchmark
type: mteb/stsbenchmark-sts
config: default
split: test
revision: 8913289635987208e6e7c72789e4be2fe94b6abd
metrics:
- type: cos_sim_pearson
value: 80.79514366062675
- type: cos_sim_spearman
value: 79.20585637461048
- type: euclidean_pearson
value: 78.6591557395699
- type: euclidean_spearman
value: 77.86455794285718
- type: manhattan_pearson
value: 78.67754806486865
- type: manhattan_spearman
value: 77.88178687200732
- task:
type: Reranking
dataset:
name: MTEB SciDocsRR
type: mteb/scidocs-reranking
config: default
split: test
revision: 56a6d0140cf6356659e2a7c1413286a774468d44
metrics:
- type: map
value: 77.71580844366375
- type: mrr
value: 93.04215845882513
- task:
type: Retrieval
dataset:
name: MTEB SciFact
type: scifact
config: default
split: test
revision: a75ae049398addde9b70f6b268875f5cbce99089
metrics:
- type: map_at_1
value: 56.39999999999999
- type: map_at_10
value: 65.701
- type: map_at_100
value: 66.32000000000001
- type: map_at_1000
value: 66.34100000000001
- type: map_at_3
value: 62.641999999999996
- type: map_at_5
value: 64.342
- type: mrr_at_1
value: 58.667
- type: mrr_at_10
value: 66.45299999999999
- type: mrr_at_100
value: 66.967
- type: mrr_at_1000
value: 66.988
- type: mrr_at_3
value: 64.11099999999999
- type: mrr_at_5
value: 65.411
- type: ndcg_at_1
value: 58.667
- type: ndcg_at_10
value: 70.165
- type: ndcg_at_100
value: 72.938
- type: ndcg_at_1000
value: 73.456
- type: ndcg_at_3
value: 64.79
- type: ndcg_at_5
value: 67.28
- type: precision_at_1
value: 58.667
- type: precision_at_10
value: 9.4
- type: precision_at_100
value: 1.087
- type: precision_at_1000
value: 0.11299999999999999
- type: precision_at_3
value: 24.889
- type: precision_at_5
value: 16.667
- type: recall_at_1
value: 56.39999999999999
- type: recall_at_10
value: 83.122
- type: recall_at_100
value: 95.667
- type: recall_at_1000
value: 99.667
- type: recall_at_3
value: 68.378
- type: recall_at_5
value: 74.68299999999999
- task:
type: PairClassification
dataset:
name: MTEB SprintDuplicateQuestions
type: mteb/sprintduplicatequestions-pairclassification
config: default
split: test
revision: 5a8256d0dff9c4bd3be3ba3e67e4e70173f802ea
metrics:
- type: cos_sim_accuracy
value: 99.76831683168317
- type: cos_sim_ap
value: 93.47124923047998
- type: cos_sim_f1
value: 88.06122448979592
- type: cos_sim_precision
value: 89.89583333333333
- type: cos_sim_recall
value: 86.3
- type: dot_accuracy
value: 99.57326732673268
- type: dot_ap
value: 84.06577868167207
- type: dot_f1
value: 77.82629791363416
- type: dot_precision
value: 75.58906691800189
- type: dot_recall
value: 80.2
- type: euclidean_accuracy
value: 99.74257425742574
- type: euclidean_ap
value: 92.1904681653555
- type: euclidean_f1
value: 86.74821610601427
- type: euclidean_precision
value: 88.46153846153845
- type: euclidean_recall
value: 85.1
- type: manhattan_accuracy
value: 99.74554455445545
- type: manhattan_ap
value: 92.4337790809948
- type: manhattan_f1
value: 86.86765457332653
- type: manhattan_precision
value: 88.81922675026124
- type: manhattan_recall
value: 85.0
- type: max_accuracy
value: 99.76831683168317
- type: max_ap
value: 93.47124923047998
- type: max_f1
value: 88.06122448979592
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClustering
type: mteb/stackexchange-clustering
config: default
split: test
revision: 70a89468f6dccacc6aa2b12a6eac54e74328f235
metrics:
- type: v_measure
value: 59.194098673976484
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClusteringP2P
type: mteb/stackexchange-clustering-p2p
config: default
split: test
revision: d88009ab563dd0b16cfaf4436abaf97fa3550cf0
metrics:
- type: v_measure
value: 32.5744032578115
- task:
type: Reranking
dataset:
name: MTEB StackOverflowDupQuestions
type: mteb/stackoverflowdupquestions-reranking
config: default
split: test
revision: ef807ea29a75ec4f91b50fd4191cb4ee4589a9f9
metrics:
- type: map
value: 49.61186384154483
- type: mrr
value: 50.55424253034547
- task:
type: Summarization
dataset:
name: MTEB SummEval
type: mteb/summeval
config: default
split: test
revision: 8753c2788d36c01fc6f05d03fe3f7268d63f9122
metrics:
- type: cos_sim_pearson
value: 30.027210161713946
- type: cos_sim_spearman
value: 31.030178065751734
- type: dot_pearson
value: 30.09179785685587
- type: dot_spearman
value: 30.408303252207812
- task:
type: Retrieval
dataset:
name: MTEB TRECCOVID
type: trec-covid
config: default
split: test
revision: 2c8041b2c07a79b6f7ba8fe6acc72e5d9f92d217
metrics:
- type: map_at_1
value: 0.22300000000000003
- type: map_at_10
value: 1.762
- type: map_at_100
value: 9.984
- type: map_at_1000
value: 24.265
- type: map_at_3
value: 0.631
- type: map_at_5
value: 0.9950000000000001
- type: mrr_at_1
value: 88.0
- type: mrr_at_10
value: 92.833
- type: mrr_at_100
value: 92.833
- type: mrr_at_1000
value: 92.833
- type: mrr_at_3
value: 92.333
- type: mrr_at_5
value: 92.833
- type: ndcg_at_1
value: 83.0
- type: ndcg_at_10
value: 75.17
- type: ndcg_at_100
value: 55.432
- type: ndcg_at_1000
value: 49.482
- type: ndcg_at_3
value: 82.184
- type: ndcg_at_5
value: 79.712
- type: precision_at_1
value: 88.0
- type: precision_at_10
value: 78.60000000000001
- type: precision_at_100
value: 56.56
- type: precision_at_1000
value: 22.334
- type: precision_at_3
value: 86.667
- type: precision_at_5
value: 83.6
- type: recall_at_1
value: 0.22300000000000003
- type: recall_at_10
value: 1.9879999999999998
- type: recall_at_100
value: 13.300999999999998
- type: recall_at_1000
value: 46.587
- type: recall_at_3
value: 0.6629999999999999
- type: recall_at_5
value: 1.079
- task:
type: Retrieval
dataset:
name: MTEB Touche2020
type: webis-touche2020
config: default
split: test
revision: 527b7d77e16e343303e68cb6af11d6e18b9f7b3b
metrics:
- type: map_at_1
value: 3.047
- type: map_at_10
value: 8.792
- type: map_at_100
value: 14.631
- type: map_at_1000
value: 16.127
- type: map_at_3
value: 4.673
- type: map_at_5
value: 5.897
- type: mrr_at_1
value: 38.775999999999996
- type: mrr_at_10
value: 49.271
- type: mrr_at_100
value: 50.181
- type: mrr_at_1000
value: 50.2
- type: mrr_at_3
value: 44.558
- type: mrr_at_5
value: 47.925000000000004
- type: ndcg_at_1
value: 35.714
- type: ndcg_at_10
value: 23.44
- type: ndcg_at_100
value: 35.345
- type: ndcg_at_1000
value: 46.495
- type: ndcg_at_3
value: 26.146
- type: ndcg_at_5
value: 24.878
- type: precision_at_1
value: 38.775999999999996
- type: precision_at_10
value: 20.816000000000003
- type: precision_at_100
value: 7.428999999999999
- type: precision_at_1000
value: 1.494
- type: precision_at_3
value: 25.85
- type: precision_at_5
value: 24.082
- type: recall_at_1
value: 3.047
- type: recall_at_10
value: 14.975
- type: recall_at_100
value: 45.943
- type: recall_at_1000
value: 80.31099999999999
- type: recall_at_3
value: 5.478000000000001
- type: recall_at_5
value: 8.294
- task:
type: Classification
dataset:
name: MTEB ToxicConversationsClassification
type: mteb/toxic_conversations_50k
config: default
split: test
revision: edfaf9da55d3dd50d43143d90c1ac476895ae6de
metrics:
- type: accuracy
value: 68.84080000000002
- type: ap
value: 13.135219251019848
- type: f1
value: 52.849999421995506
- task:
type: Classification
dataset:
name: MTEB TweetSentimentExtractionClassification
type: mteb/tweet_sentiment_extraction
config: default
split: test
revision: 62146448f05be9e52a36b8ee9936447ea787eede
metrics:
- type: accuracy
value: 56.68647425014149
- type: f1
value: 56.97981427365949
- task:
type: Clustering
dataset:
name: MTEB TwentyNewsgroupsClustering
type: mteb/twentynewsgroups-clustering
config: default
split: test
revision: 091a54f9a36281ce7d6590ec8c75dd485e7e01d4
metrics:
- type: v_measure
value: 40.8911707239219
- task:
type: PairClassification
dataset:
name: MTEB TwitterSemEval2015
type: mteb/twittersemeval2015-pairclassification
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 83.04226023722954
- type: cos_sim_ap
value: 63.681339908301325
- type: cos_sim_f1
value: 60.349184470480125
- type: cos_sim_precision
value: 53.437754271765655
- type: cos_sim_recall
value: 69.31398416886545
- type: dot_accuracy
value: 81.46271681468677
- type: dot_ap
value: 57.78072296265885
- type: dot_f1
value: 56.28769265132901
- type: dot_precision
value: 48.7993803253292
- type: dot_recall
value: 66.49076517150397
- type: euclidean_accuracy
value: 82.16606067830959
- type: euclidean_ap
value: 59.974530371203514
- type: euclidean_f1
value: 56.856023506366306
- type: euclidean_precision
value: 53.037916857012334
- type: euclidean_recall
value: 61.2664907651715
- type: manhattan_accuracy
value: 82.16606067830959
- type: manhattan_ap
value: 59.98962379571767
- type: manhattan_f1
value: 56.98153158451947
- type: manhattan_precision
value: 51.41158989598811
- type: manhattan_recall
value: 63.90501319261214
- type: max_accuracy
value: 83.04226023722954
- type: max_ap
value: 63.681339908301325
- type: max_f1
value: 60.349184470480125
- task:
type: PairClassification
dataset:
name: MTEB TwitterURLCorpus
type: mteb/twitterurlcorpus-pairclassification
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 88.56871191834517
- type: cos_sim_ap
value: 84.80240716354544
- type: cos_sim_f1
value: 77.07765285922385
- type: cos_sim_precision
value: 74.84947406601378
- type: cos_sim_recall
value: 79.44256236526024
- type: dot_accuracy
value: 86.00923662048356
- type: dot_ap
value: 78.6556459012073
- type: dot_f1
value: 72.7583749109052
- type: dot_precision
value: 67.72823779193206
- type: dot_recall
value: 78.59562673236834
- type: euclidean_accuracy
value: 87.84103698529127
- type: euclidean_ap
value: 83.50424424952834
- type: euclidean_f1
value: 75.74496544549307
- type: euclidean_precision
value: 73.19402556369381
- type: euclidean_recall
value: 78.48013550970127
- type: manhattan_accuracy
value: 87.9225365777933
- type: manhattan_ap
value: 83.49479248597825
- type: manhattan_f1
value: 75.67748162447101
- type: manhattan_precision
value: 73.06810035842294
- type: manhattan_recall
value: 78.48013550970127
- type: max_accuracy
value: 88.56871191834517
- type: max_ap
value: 84.80240716354544
- type: max_f1
value: 77.07765285922385
---
# SGPT-2.7B-weightedmean-msmarco-specb-bitfit
## Usage
For usage instructions, refer to our codebase: https://github.com/Muennighoff/sgpt
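A minimal retrieval sketch with the [sentence-transformers](https://www.sbert.net) API is shown below. This is illustrative rather than the canonical SGPT pipeline: the `specb` variant is trained with special bracket tokens around queries and documents, and the exact asymmetric encoding is implemented in the codebase linked above. The Hub repository id and the example texts are assumptions.

```python
from sentence_transformers import SentenceTransformer, util

# Assumed Hub id for this checkpoint; see the SGPT repository for the
# bracket-token (specb) query/document encoding used in the paper.
model = SentenceTransformer("Muennighoff/SGPT-2.7B-weightedmean-msmarco-specb-bitfit")

queries = ["How do sentence embeddings work?"]
docs = ["Sentence embeddings map variable-length text to fixed-size vectors."]

q_emb = model.encode(queries)  # shape: (1, 2560), weighted-mean pooled
d_emb = model.encode(docs)

print(util.cos_sim(q_emb, d_emb))  # cosine similarity used for ranking
```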
## Evaluation Results
For evaluation results, see the `eval` folder of the codebase or our paper: https://arxiv.org/abs/2202.08904
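The metric values in the metadata above come from the [MTEB](https://github.com/embeddings-benchmark/mteb) benchmark. A hedged sketch of re-running a single task with the `mteb` package (the task name and output folder here are chosen for illustration):

```python
from mteb import MTEB
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("Muennighoff/SGPT-2.7B-weightedmean-msmarco-specb-bitfit")

# Evaluate one classification task; pass more task names to reproduce more rows.
evaluation = MTEB(tasks=["Banking77Classification"])
evaluation.run(model, output_folder="results/sgpt-2.7b")
```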
## Training
The model was trained with the following parameters:
**DataLoader**:
`torch.utils.data.dataloader.DataLoader` of length 124796 with parameters:
```
{'batch_size': 4, 'sampler': 'torch.utils.data.sampler.RandomSampler', 'batch_sampler': 'torch.utils.data.sampler.BatchSampler'}
```
**Loss**:
`sentence_transformers.losses.MultipleNegativesRankingLoss.MultipleNegativesRankingLoss` with parameters:
```
{'scale': 20.0, 'similarity_fct': 'cos_sim'}
```
Parameters of the `fit()` method:
```
{
"epochs": 10,
"evaluation_steps": 0,
"evaluator": "NoneType",
"max_grad_norm": 1,
"optimizer_class": "<class 'transformers.optimization.AdamW'>",
"optimizer_params": {
"lr": 7.5e-05
},
"scheduler": "WarmupLinear",
"steps_per_epoch": null,
"warmup_steps": 1000,
"weight_decay": 0.01
}
```
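As a hedged illustration of how these values map onto the sentence-transformers training API (the actual run fine-tunes a GPT-Neo 2.7B backbone with BitFit, i.e. bias-only updates, on MS MARCO; `train_examples` below is a placeholder for that data pipeline):

```python
from torch.utils.data import DataLoader
from sentence_transformers import InputExample, SentenceTransformer, losses, util

# Placeholder starting point; the real run begins from a raw GPT-Neo backbone
# (see the architecture section below), not from this finished checkpoint.
model = SentenceTransformer("Muennighoff/SGPT-2.7B-weightedmean-msmarco-specb-bitfit")

train_examples = [
    InputExample(texts=["example query", "relevant passage"]),
    # ... one (query, positive passage) pair per example; in-batch
    # negatives are supplied implicitly by the loss below.
]
train_dataloader = DataLoader(train_examples, shuffle=True, batch_size=4)
train_loss = losses.MultipleNegativesRankingLoss(
    model, scale=20.0, similarity_fct=util.cos_sim
)

model.fit(
    train_objectives=[(train_dataloader, train_loss)],
    epochs=10,
    scheduler="WarmupLinear",
    warmup_steps=1000,
    optimizer_params={"lr": 7.5e-5},
    weight_decay=0.01,
    max_grad_norm=1,
)
```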
## Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 300, 'do_lower_case': False}) with Transformer model: GPTNeoModel
(1): Pooling({'word_embedding_dimension': 2560, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': True, 'pooling_mode_lasttoken': False})
)
```
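This two-module stack can be assembled from sentence-transformers building blocks. A sketch, assuming the `EleutherAI/gpt-neo-2.7B` backbone and a sentence-transformers version that supports weighted-mean pooling:

```python
from sentence_transformers import SentenceTransformer, models

# Module (0): GPT-Neo transformer, truncating inputs at 300 tokens.
word_embedding_model = models.Transformer(
    "EleutherAI/gpt-neo-2.7B", max_seq_length=300, do_lower_case=False
)
# Module (1): position-weighted mean pooling over the 2560-dim token embeddings.
pooling_model = models.Pooling(
    word_embedding_model.get_word_embedding_dimension(),
    pooling_mode_weightedmean_tokens=True,
)

model = SentenceTransformer(modules=[word_embedding_model, pooling_model])
```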
## Citing & Authors
```bibtex
@article{muennighoff2022sgpt,
title={SGPT: GPT Sentence Embeddings for Semantic Search},
author={Muennighoff, Niklas},
journal={arXiv preprint arXiv:2202.08904},
year={2022}
}
```
"SUMMARIZATION"
]
| [
"BIOSSES",
"SCIFACT"
]
| Non_BioNLP |
# SGPT-2.7B-weightedmean-msmarco-specb-bitfit
## Usage
For usage instructions, refer to our codebase: https://github.com/Muennighoff/sgpt
## Evaluation Results
For eval results, refer to the eval folder or our paper: https://arxiv.org/abs/2202.08904
## Training
The model was trained with the parameters:
**DataLoader**:
`torch.utils.data.dataloader.DataLoader` of length 124796 with parameters:
```
{'batch_size': 4, 'sampler': 'torch.utils.data.sampler.RandomSampler', 'batch_sampler': 'torch.utils.data.sampler.BatchSampler'}
```
**Loss**:
`sentence_transformers.losses.MultipleNegativesRankingLoss.MultipleNegativesRankingLoss` with parameters:
```
{'scale': 20.0, 'similarity_fct': 'cos_sim'}
```
Parameters of the fit()-Method:
```
{
"epochs": 10,
"evaluation_steps": 0,
"evaluator": "NoneType",
"max_grad_norm": 1,
"optimizer_class": "<class 'transformers.optimization.AdamW'>",
"optimizer_params": {
"lr": 7.5e-05
},
"scheduler": "WarmupLinear",
"steps_per_epoch": null,
"warmup_steps": 1000,
"weight_decay": 0.01
}
```
## Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 300, 'do_lower_case': False}) with Transformer model: GPTNeoModel
(1): Pooling({'word_embedding_dimension': 2560, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': True, 'pooling_mode_lasttoken': False})
)
```
## Citing & Authors
```bibtex
@article{muennighoff2022sgpt,
title={SGPT: GPT Sentence Embeddings for Semantic Search},
author={Muennighoff, Niklas},
journal={arXiv preprint arXiv:2202.08904},
year={2022}
}
``` | {"pipeline_tag": "sentence-similarity", "tags": ["sentence-transformers", "feature-extraction", "sentence-similarity", "mteb"], "model-index": [{"name": "SGPT-2.7B-weightedmean-msmarco-specb-bitfit", "results": [{"task": {"type": "Classification"}, "dataset": {"name": "MTEB AmazonCounterfactualClassification (en)", "type": "mteb/amazon_counterfactual", "config": "en", "split": "test", "revision": "2d8a100785abf0ae21420d2a55b0c56e3e1ea996"}, "metrics": [{"type": "accuracy", "value": 67.56716417910448}, {"type": "ap", "value": 30.75574629595259}, {"type": "f1", "value": 61.805121301858655}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB AmazonPolarityClassification", "type": "mteb/amazon_polarity", "config": "default", "split": "test", "revision": "80714f8dcf8cefc218ef4f8c5a966dd83f75a0e1"}, "metrics": [{"type": "accuracy", "value": 71.439575}, {"type": "ap", "value": 65.91341330532453}, {"type": "f1", "value": 70.90561852619555}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB AmazonReviewsClassification (en)", "type": "mteb/amazon_reviews_multi", "config": "en", "split": "test", "revision": "c379a6705fec24a2493fa68e011692605f44e119"}, "metrics": [{"type": "accuracy", "value": 35.748000000000005}, {"type": "f1", "value": 35.48576287186347}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB ArguAna", "type": "arguana", "config": "default", "split": "test", "revision": "5b3e3697907184a9b77a3c99ee9ea1a9cbb1e4e3"}, "metrics": [{"type": "map_at_1", "value": 25.96}, {"type": "map_at_10", "value": 41.619}, {"type": "map_at_100", "value": 42.673}, {"type": "map_at_1000", "value": 42.684}, {"type": "map_at_3", "value": 36.569}, {"type": "map_at_5", "value": 39.397}, {"type": "mrr_at_1", "value": 26.316}, {"type": "mrr_at_10", "value": 41.772}, {"type": "mrr_at_100", "value": 42.82}, {"type": "mrr_at_1000", "value": 42.83}, {"type": "mrr_at_3", "value": 36.724000000000004}, {"type": "mrr_at_5", "value": 39.528999999999996}, {"type": "ndcg_at_1", "value": 25.96}, {"type": "ndcg_at_10", "value": 50.491}, {"type": "ndcg_at_100", "value": 54.864999999999995}, {"type": "ndcg_at_1000", "value": 55.10699999999999}, {"type": "ndcg_at_3", "value": 40.053}, {"type": "ndcg_at_5", "value": 45.134}, {"type": "precision_at_1", "value": 25.96}, {"type": "precision_at_10", "value": 7.8950000000000005}, {"type": "precision_at_100", "value": 0.9780000000000001}, {"type": "precision_at_1000", "value": 0.1}, {"type": "precision_at_3", "value": 16.714000000000002}, {"type": "precision_at_5", "value": 12.489}, {"type": "recall_at_1", "value": 25.96}, {"type": "recall_at_10", "value": 78.947}, {"type": "recall_at_100", "value": 97.795}, {"type": "recall_at_1000", "value": 99.644}, {"type": "recall_at_3", "value": 50.141999999999996}, {"type": "recall_at_5", "value": 62.446999999999996}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB ArxivClusteringP2P", "type": "mteb/arxiv-clustering-p2p", "config": "default", "split": "test", "revision": "0bbdb47bcbe3a90093699aefeed338a0f28a7ee8"}, "metrics": [{"type": "v_measure", "value": 44.72125714642202}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB ArxivClusteringS2S", "type": "mteb/arxiv-clustering-s2s", "config": "default", "split": "test", "revision": "b73bd54100e5abfa6e3a23dcafb46fe4d2438dc3"}, "metrics": [{"type": "v_measure", "value": 35.081451519142064}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB AskUbuntuDupQuestions", "type": "mteb/askubuntudupquestions-reranking", "config": 
"default", "split": "test", "revision": "4d853f94cd57d85ec13805aeeac3ae3e5eb4c49c"}, "metrics": [{"type": "map", "value": 59.634661990392054}, {"type": "mrr", "value": 73.6813525040672}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB BIOSSES", "type": "mteb/biosses-sts", "config": "default", "split": "test", "revision": "9ee918f184421b6bd48b78f6c714d86546106103"}, "metrics": [{"type": "cos_sim_pearson", "value": 87.42754550496836}, {"type": "cos_sim_spearman", "value": 84.84289705838664}, {"type": "euclidean_pearson", "value": 85.59331970450859}, {"type": "euclidean_spearman", "value": 85.8525586184271}, {"type": "manhattan_pearson", "value": 85.41233134466698}, {"type": "manhattan_spearman", "value": 85.52303303767404}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB Banking77Classification", "type": "mteb/banking77", "config": "default", "split": "test", "revision": "44fa15921b4c889113cc5df03dd4901b49161ab7"}, "metrics": [{"type": "accuracy", "value": 83.21753246753246}, {"type": "f1", "value": 83.15394543120915}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB BiorxivClusteringP2P", "type": "mteb/biorxiv-clustering-p2p", "config": "default", "split": "test", "revision": "11d0121201d1f1f280e8cc8f3d98fb9c4d9f9c55"}, "metrics": [{"type": "v_measure", "value": 34.41414219680629}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB BiorxivClusteringS2S", "type": "mteb/biorxiv-clustering-s2s", "config": "default", "split": "test", "revision": "c0fab014e1bcb8d3a5e31b2088972a1e01547dc1"}, "metrics": [{"type": "v_measure", "value": 30.533275862270028}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackAndroidRetrieval", "type": "BeIR/cqadupstack", "config": "default", "split": "test", "revision": "2b9f5791698b5be7bc5e10535c8690f20043c3db"}, "metrics": [{"type": "map_at_1", "value": 30.808999999999997}, {"type": "map_at_10", "value": 40.617}, {"type": "map_at_100", "value": 41.894999999999996}, {"type": "map_at_1000", "value": 42.025}, {"type": "map_at_3", "value": 37.0}, {"type": "map_at_5", "value": 38.993}, {"type": "mrr_at_1", "value": 37.482}, {"type": "mrr_at_10", "value": 46.497}, {"type": "mrr_at_100", "value": 47.144000000000005}, {"type": "mrr_at_1000", "value": 47.189}, {"type": "mrr_at_3", "value": 43.705}, {"type": "mrr_at_5", "value": 45.193}, {"type": "ndcg_at_1", "value": 37.482}, {"type": "ndcg_at_10", "value": 46.688}, {"type": "ndcg_at_100", "value": 51.726000000000006}, {"type": "ndcg_at_1000", "value": 53.825}, {"type": "ndcg_at_3", "value": 41.242000000000004}, {"type": "ndcg_at_5", "value": 43.657000000000004}, {"type": "precision_at_1", "value": 37.482}, {"type": "precision_at_10", "value": 8.827}, {"type": "precision_at_100", "value": 1.393}, {"type": "precision_at_1000", "value": 0.186}, {"type": "precision_at_3", "value": 19.361}, {"type": "precision_at_5", "value": 14.106}, {"type": "recall_at_1", "value": 30.808999999999997}, {"type": "recall_at_10", "value": 58.47}, {"type": "recall_at_100", "value": 80.51899999999999}, {"type": "recall_at_1000", "value": 93.809}, {"type": "recall_at_3", "value": 42.462}, {"type": "recall_at_5", "value": 49.385}, {"type": "map_at_1", "value": 26.962000000000003}, {"type": "map_at_10", "value": 36.93}, {"type": "map_at_100", "value": 38.102000000000004}, {"type": "map_at_1000", "value": 38.22}, {"type": "map_at_3", "value": 34.065}, {"type": "map_at_5", "value": 35.72}, {"type": "mrr_at_1", "value": 33.567}, {"type": "mrr_at_10", "value": 42.269}, {"type": 
"mrr_at_100", "value": 42.99}, {"type": "mrr_at_1000", "value": 43.033}, {"type": "mrr_at_3", "value": 40.064}, {"type": "mrr_at_5", "value": 41.258}, {"type": "ndcg_at_1", "value": 33.567}, {"type": "ndcg_at_10", "value": 42.405}, {"type": "ndcg_at_100", "value": 46.847}, {"type": "ndcg_at_1000", "value": 48.951}, {"type": "ndcg_at_3", "value": 38.312000000000005}, {"type": "ndcg_at_5", "value": 40.242}, {"type": "precision_at_1", "value": 33.567}, {"type": "precision_at_10", "value": 8.032}, {"type": "precision_at_100", "value": 1.295}, {"type": "precision_at_1000", "value": 0.17600000000000002}, {"type": "precision_at_3", "value": 18.662}, {"type": "precision_at_5", "value": 13.299}, {"type": "recall_at_1", "value": 26.962000000000003}, {"type": "recall_at_10", "value": 52.489}, {"type": "recall_at_100", "value": 71.635}, {"type": "recall_at_1000", "value": 85.141}, {"type": "recall_at_3", "value": 40.28}, {"type": "recall_at_5", "value": 45.757}, {"type": "map_at_1", "value": 36.318}, {"type": "map_at_10", "value": 47.97}, {"type": "map_at_100", "value": 49.003}, {"type": "map_at_1000", "value": 49.065999999999995}, {"type": "map_at_3", "value": 45.031}, {"type": "map_at_5", "value": 46.633}, {"type": "mrr_at_1", "value": 41.504999999999995}, {"type": "mrr_at_10", "value": 51.431000000000004}, {"type": "mrr_at_100", "value": 52.129000000000005}, {"type": "mrr_at_1000", "value": 52.161}, {"type": "mrr_at_3", "value": 48.934}, {"type": "mrr_at_5", "value": 50.42}, {"type": "ndcg_at_1", "value": 41.504999999999995}, {"type": "ndcg_at_10", "value": 53.676}, {"type": "ndcg_at_100", "value": 57.867000000000004}, {"type": "ndcg_at_1000", "value": 59.166}, {"type": "ndcg_at_3", "value": 48.516}, {"type": "ndcg_at_5", "value": 50.983999999999995}, {"type": "precision_at_1", "value": 41.504999999999995}, {"type": "precision_at_10", "value": 8.608}, {"type": "precision_at_100", "value": 1.1560000000000001}, {"type": "precision_at_1000", "value": 0.133}, {"type": "precision_at_3", "value": 21.462999999999997}, {"type": "precision_at_5", "value": 14.721}, {"type": "recall_at_1", "value": 36.318}, {"type": "recall_at_10", "value": 67.066}, {"type": "recall_at_100", "value": 85.34}, {"type": "recall_at_1000", "value": 94.491}, {"type": "recall_at_3", "value": 53.215999999999994}, {"type": "recall_at_5", "value": 59.214}, {"type": "map_at_1", "value": 22.167}, {"type": "map_at_10", "value": 29.543999999999997}, {"type": "map_at_100", "value": 30.579}, {"type": "map_at_1000", "value": 30.669999999999998}, {"type": "map_at_3", "value": 26.982}, {"type": "map_at_5", "value": 28.474}, {"type": "mrr_at_1", "value": 24.068}, {"type": "mrr_at_10", "value": 31.237}, {"type": "mrr_at_100", "value": 32.222}, {"type": "mrr_at_1000", "value": 32.292}, {"type": "mrr_at_3", "value": 28.776000000000003}, {"type": "mrr_at_5", "value": 30.233999999999998}, {"type": "ndcg_at_1", "value": 24.068}, {"type": "ndcg_at_10", "value": 33.973}, {"type": "ndcg_at_100", "value": 39.135}, {"type": "ndcg_at_1000", "value": 41.443999999999996}, {"type": "ndcg_at_3", "value": 29.018}, {"type": "ndcg_at_5", "value": 31.558999999999997}, {"type": "precision_at_1", "value": 24.068}, {"type": "precision_at_10", "value": 5.299}, {"type": "precision_at_100", "value": 0.823}, {"type": "precision_at_1000", "value": 0.106}, {"type": "precision_at_3", "value": 12.166}, {"type": "precision_at_5", "value": 8.767999999999999}, {"type": "recall_at_1", "value": 22.167}, {"type": "recall_at_10", "value": 46.115}, {"type": "recall_at_100", "value": 
69.867}, {"type": "recall_at_1000", "value": 87.234}, {"type": "recall_at_3", "value": 32.798}, {"type": "recall_at_5", "value": 38.951}, {"type": "map_at_1", "value": 12.033000000000001}, {"type": "map_at_10", "value": 19.314}, {"type": "map_at_100", "value": 20.562}, {"type": "map_at_1000", "value": 20.695}, {"type": "map_at_3", "value": 16.946}, {"type": "map_at_5", "value": 18.076999999999998}, {"type": "mrr_at_1", "value": 14.801}, {"type": "mrr_at_10", "value": 22.74}, {"type": "mrr_at_100", "value": 23.876}, {"type": "mrr_at_1000", "value": 23.949}, {"type": "mrr_at_3", "value": 20.211000000000002}, {"type": "mrr_at_5", "value": 21.573}, {"type": "ndcg_at_1", "value": 14.801}, {"type": "ndcg_at_10", "value": 24.038}, {"type": "ndcg_at_100", "value": 30.186}, {"type": "ndcg_at_1000", "value": 33.321}, {"type": "ndcg_at_3", "value": 19.431}, {"type": "ndcg_at_5", "value": 21.34}, {"type": "precision_at_1", "value": 14.801}, {"type": "precision_at_10", "value": 4.776}, {"type": "precision_at_100", "value": 0.897}, {"type": "precision_at_1000", "value": 0.133}, {"type": "precision_at_3", "value": 9.66}, {"type": "precision_at_5", "value": 7.239}, {"type": "recall_at_1", "value": 12.033000000000001}, {"type": "recall_at_10", "value": 35.098}, {"type": "recall_at_100", "value": 62.175000000000004}, {"type": "recall_at_1000", "value": 84.17099999999999}, {"type": "recall_at_3", "value": 22.61}, {"type": "recall_at_5", "value": 27.278999999999996}, {"type": "map_at_1", "value": 26.651000000000003}, {"type": "map_at_10", "value": 36.901}, {"type": "map_at_100", "value": 38.249}, {"type": "map_at_1000", "value": 38.361000000000004}, {"type": "map_at_3", "value": 33.891}, {"type": "map_at_5", "value": 35.439}, {"type": "mrr_at_1", "value": 32.724}, {"type": "mrr_at_10", "value": 42.504}, {"type": "mrr_at_100", "value": 43.391999999999996}, {"type": "mrr_at_1000", "value": 43.436}, {"type": "mrr_at_3", "value": 39.989999999999995}, {"type": "mrr_at_5", "value": 41.347}, {"type": "ndcg_at_1", "value": 32.724}, {"type": "ndcg_at_10", "value": 43.007}, {"type": "ndcg_at_100", "value": 48.601}, {"type": "ndcg_at_1000", "value": 50.697}, {"type": "ndcg_at_3", "value": 37.99}, {"type": "ndcg_at_5", "value": 40.083999999999996}, {"type": "precision_at_1", "value": 32.724}, {"type": "precision_at_10", "value": 7.872999999999999}, {"type": "precision_at_100", "value": 1.247}, {"type": "precision_at_1000", "value": 0.16199999999999998}, {"type": "precision_at_3", "value": 18.062}, {"type": "precision_at_5", "value": 12.666}, {"type": "recall_at_1", "value": 26.651000000000003}, {"type": "recall_at_10", "value": 55.674}, {"type": "recall_at_100", "value": 78.904}, {"type": "recall_at_1000", "value": 92.55799999999999}, {"type": "recall_at_3", "value": 41.36}, {"type": "recall_at_5", "value": 46.983999999999995}, {"type": "map_at_1", "value": 22.589000000000002}, {"type": "map_at_10", "value": 32.244}, {"type": "map_at_100", "value": 33.46}, {"type": "map_at_1000", "value": 33.593}, {"type": "map_at_3", "value": 29.21}, {"type": "map_at_5", "value": 31.019999999999996}, {"type": "mrr_at_1", "value": 28.425}, {"type": "mrr_at_10", "value": 37.282}, {"type": "mrr_at_100", "value": 38.187}, {"type": "mrr_at_1000", "value": 38.248}, {"type": "mrr_at_3", "value": 34.684}, {"type": "mrr_at_5", "value": 36.123}, {"type": "ndcg_at_1", "value": 28.425}, {"type": "ndcg_at_10", "value": 37.942}, {"type": "ndcg_at_100", "value": 43.443}, {"type": "ndcg_at_1000", "value": 45.995999999999995}, {"type": "ndcg_at_3", 
"value": 32.873999999999995}, {"type": "ndcg_at_5", "value": 35.325}, {"type": "precision_at_1", "value": 28.425}, {"type": "precision_at_10", "value": 7.1}, {"type": "precision_at_100", "value": 1.166}, {"type": "precision_at_1000", "value": 0.158}, {"type": "precision_at_3", "value": 16.02}, {"type": "precision_at_5", "value": 11.644}, {"type": "recall_at_1", "value": 22.589000000000002}, {"type": "recall_at_10", "value": 50.03999999999999}, {"type": "recall_at_100", "value": 73.973}, {"type": "recall_at_1000", "value": 91.128}, {"type": "recall_at_3", "value": 35.882999999999996}, {"type": "recall_at_5", "value": 42.187999999999995}, {"type": "map_at_1", "value": 23.190833333333334}, {"type": "map_at_10", "value": 31.504916666666666}, {"type": "map_at_100", "value": 32.64908333333334}, {"type": "map_at_1000", "value": 32.77075}, {"type": "map_at_3", "value": 28.82575}, {"type": "map_at_5", "value": 30.2755}, {"type": "mrr_at_1", "value": 27.427499999999995}, {"type": "mrr_at_10", "value": 35.36483333333334}, {"type": "mrr_at_100", "value": 36.23441666666666}, {"type": "mrr_at_1000", "value": 36.297583333333336}, {"type": "mrr_at_3", "value": 32.97966666666667}, {"type": "mrr_at_5", "value": 34.294583333333335}, {"type": "ndcg_at_1", "value": 27.427499999999995}, {"type": "ndcg_at_10", "value": 36.53358333333333}, {"type": "ndcg_at_100", "value": 41.64508333333333}, {"type": "ndcg_at_1000", "value": 44.14499999999999}, {"type": "ndcg_at_3", "value": 31.88908333333333}, {"type": "ndcg_at_5", "value": 33.98433333333333}, {"type": "precision_at_1", "value": 27.427499999999995}, {"type": "precision_at_10", "value": 6.481083333333333}, {"type": "precision_at_100", "value": 1.0610833333333334}, {"type": "precision_at_1000", "value": 0.14691666666666667}, {"type": "precision_at_3", "value": 14.656749999999999}, {"type": "precision_at_5", "value": 10.493583333333332}, {"type": "recall_at_1", "value": 23.190833333333334}, {"type": "recall_at_10", "value": 47.65175}, {"type": "recall_at_100", "value": 70.41016666666667}, {"type": "recall_at_1000", "value": 87.82708333333332}, {"type": "recall_at_3", "value": 34.637583333333325}, {"type": "recall_at_5", "value": 40.05008333333333}, {"type": "map_at_1", "value": 20.409}, {"type": "map_at_10", "value": 26.794}, {"type": "map_at_100", "value": 27.682000000000002}, {"type": "map_at_1000", "value": 27.783}, {"type": "map_at_3", "value": 24.461}, {"type": "map_at_5", "value": 25.668000000000003}, {"type": "mrr_at_1", "value": 22.853}, {"type": "mrr_at_10", "value": 29.296}, {"type": "mrr_at_100", "value": 30.103}, {"type": "mrr_at_1000", "value": 30.179000000000002}, {"type": "mrr_at_3", "value": 27.173000000000002}, {"type": "mrr_at_5", "value": 28.223}, {"type": "ndcg_at_1", "value": 22.853}, {"type": "ndcg_at_10", "value": 31.007}, {"type": "ndcg_at_100", "value": 35.581}, {"type": "ndcg_at_1000", "value": 38.147}, {"type": "ndcg_at_3", "value": 26.590999999999998}, {"type": "ndcg_at_5", "value": 28.43}, {"type": "precision_at_1", "value": 22.853}, {"type": "precision_at_10", "value": 5.031}, {"type": "precision_at_100", "value": 0.7939999999999999}, {"type": "precision_at_1000", "value": 0.11}, {"type": "precision_at_3", "value": 11.401}, {"type": "precision_at_5", "value": 8.16}, {"type": "recall_at_1", "value": 20.409}, {"type": "recall_at_10", "value": 41.766}, {"type": "recall_at_100", "value": 62.964}, {"type": "recall_at_1000", "value": 81.682}, {"type": "recall_at_3", "value": 29.281000000000002}, {"type": "recall_at_5", "value": 33.83}, 
{"type": "map_at_1", "value": 14.549000000000001}, {"type": "map_at_10", "value": 20.315}, {"type": "map_at_100", "value": 21.301000000000002}, {"type": "map_at_1000", "value": 21.425}, {"type": "map_at_3", "value": 18.132}, {"type": "map_at_5", "value": 19.429}, {"type": "mrr_at_1", "value": 17.86}, {"type": "mrr_at_10", "value": 23.860999999999997}, {"type": "mrr_at_100", "value": 24.737000000000002}, {"type": "mrr_at_1000", "value": 24.82}, {"type": "mrr_at_3", "value": 21.685}, {"type": "mrr_at_5", "value": 23.008}, {"type": "ndcg_at_1", "value": 17.86}, {"type": "ndcg_at_10", "value": 24.396}, {"type": "ndcg_at_100", "value": 29.328}, {"type": "ndcg_at_1000", "value": 32.486}, {"type": "ndcg_at_3", "value": 20.375}, {"type": "ndcg_at_5", "value": 22.411}, {"type": "precision_at_1", "value": 17.86}, {"type": "precision_at_10", "value": 4.47}, {"type": "precision_at_100", "value": 0.8099999999999999}, {"type": "precision_at_1000", "value": 0.125}, {"type": "precision_at_3", "value": 9.475}, {"type": "precision_at_5", "value": 7.170999999999999}, {"type": "recall_at_1", "value": 14.549000000000001}, {"type": "recall_at_10", "value": 33.365}, {"type": "recall_at_100", "value": 55.797}, {"type": "recall_at_1000", "value": 78.632}, {"type": "recall_at_3", "value": 22.229}, {"type": "recall_at_5", "value": 27.339000000000002}, {"type": "map_at_1", "value": 23.286}, {"type": "map_at_10", "value": 30.728}, {"type": "map_at_100", "value": 31.840000000000003}, {"type": "map_at_1000", "value": 31.953}, {"type": "map_at_3", "value": 28.302}, {"type": "map_at_5", "value": 29.615000000000002}, {"type": "mrr_at_1", "value": 27.239}, {"type": "mrr_at_10", "value": 34.408}, {"type": "mrr_at_100", "value": 35.335}, {"type": "mrr_at_1000", "value": 35.405}, {"type": "mrr_at_3", "value": 32.151999999999994}, {"type": "mrr_at_5", "value": 33.355000000000004}, {"type": "ndcg_at_1", "value": 27.239}, {"type": "ndcg_at_10", "value": 35.324}, {"type": "ndcg_at_100", "value": 40.866}, {"type": "ndcg_at_1000", "value": 43.584}, {"type": "ndcg_at_3", "value": 30.898999999999997}, {"type": "ndcg_at_5", "value": 32.812999999999995}, {"type": "precision_at_1", "value": 27.239}, {"type": "precision_at_10", "value": 5.896}, {"type": "precision_at_100", "value": 0.979}, {"type": "precision_at_1000", "value": 0.133}, {"type": "precision_at_3", "value": 13.713000000000001}, {"type": "precision_at_5", "value": 9.683}, {"type": "recall_at_1", "value": 23.286}, {"type": "recall_at_10", "value": 45.711}, {"type": "recall_at_100", "value": 70.611}, {"type": "recall_at_1000", "value": 90.029}, {"type": "recall_at_3", "value": 33.615}, {"type": "recall_at_5", "value": 38.41}, {"type": "map_at_1", "value": 23.962}, {"type": "map_at_10", "value": 31.942999999999998}, {"type": "map_at_100", "value": 33.384}, {"type": "map_at_1000", "value": 33.611000000000004}, {"type": "map_at_3", "value": 29.243000000000002}, {"type": "map_at_5", "value": 30.446}, {"type": "mrr_at_1", "value": 28.458}, {"type": "mrr_at_10", "value": 36.157000000000004}, {"type": "mrr_at_100", "value": 37.092999999999996}, {"type": "mrr_at_1000", "value": 37.163000000000004}, {"type": "mrr_at_3", "value": 33.86}, {"type": "mrr_at_5", "value": 35.086}, {"type": "ndcg_at_1", "value": 28.458}, {"type": "ndcg_at_10", "value": 37.201}, {"type": "ndcg_at_100", "value": 42.591}, {"type": "ndcg_at_1000", "value": 45.539}, {"type": "ndcg_at_3", "value": 32.889}, {"type": "ndcg_at_5", "value": 34.483000000000004}, {"type": "precision_at_1", "value": 28.458}, {"type": 
"precision_at_10", "value": 7.332}, {"type": "precision_at_100", "value": 1.437}, {"type": "precision_at_1000", "value": 0.233}, {"type": "precision_at_3", "value": 15.547}, {"type": "precision_at_5", "value": 11.146}, {"type": "recall_at_1", "value": 23.962}, {"type": "recall_at_10", "value": 46.751}, {"type": "recall_at_100", "value": 71.626}, {"type": "recall_at_1000", "value": 90.93900000000001}, {"type": "recall_at_3", "value": 34.138000000000005}, {"type": "recall_at_5", "value": 38.673}, {"type": "map_at_1", "value": 18.555}, {"type": "map_at_10", "value": 24.759}, {"type": "map_at_100", "value": 25.732}, {"type": "map_at_1000", "value": 25.846999999999998}, {"type": "map_at_3", "value": 22.646}, {"type": "map_at_5", "value": 23.791999999999998}, {"type": "mrr_at_1", "value": 20.148}, {"type": "mrr_at_10", "value": 26.695999999999998}, {"type": "mrr_at_100", "value": 27.605}, {"type": "mrr_at_1000", "value": 27.695999999999998}, {"type": "mrr_at_3", "value": 24.522}, {"type": "mrr_at_5", "value": 25.715}, {"type": "ndcg_at_1", "value": 20.148}, {"type": "ndcg_at_10", "value": 28.746}, {"type": "ndcg_at_100", "value": 33.57}, {"type": "ndcg_at_1000", "value": 36.584}, {"type": "ndcg_at_3", "value": 24.532}, {"type": "ndcg_at_5", "value": 26.484}, {"type": "precision_at_1", "value": 20.148}, {"type": "precision_at_10", "value": 4.529}, {"type": "precision_at_100", "value": 0.736}, {"type": "precision_at_1000", "value": 0.108}, {"type": "precision_at_3", "value": 10.351}, {"type": "precision_at_5", "value": 7.32}, {"type": "recall_at_1", "value": 18.555}, {"type": "recall_at_10", "value": 39.275999999999996}, {"type": "recall_at_100", "value": 61.511}, {"type": "recall_at_1000", "value": 84.111}, {"type": "recall_at_3", "value": 27.778999999999996}, {"type": "recall_at_5", "value": 32.591}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB ClimateFEVER", "type": "climate-fever", "config": "default", "split": "test", "revision": "392b78eb68c07badcd7c2cd8f39af108375dfcce"}, "metrics": [{"type": "map_at_1", "value": 10.366999999999999}, {"type": "map_at_10", "value": 18.953999999999997}, {"type": "map_at_100", "value": 20.674999999999997}, {"type": "map_at_1000", "value": 20.868000000000002}, {"type": "map_at_3", "value": 15.486}, {"type": "map_at_5", "value": 17.347}, {"type": "mrr_at_1", "value": 23.257}, {"type": "mrr_at_10", "value": 35.419}, {"type": "mrr_at_100", "value": 36.361}, {"type": "mrr_at_1000", "value": 36.403}, {"type": "mrr_at_3", "value": 31.747999999999998}, {"type": "mrr_at_5", "value": 34.077}, {"type": "ndcg_at_1", "value": 23.257}, {"type": "ndcg_at_10", "value": 27.11}, {"type": "ndcg_at_100", "value": 33.981}, {"type": "ndcg_at_1000", "value": 37.444}, {"type": "ndcg_at_3", "value": 21.471999999999998}, {"type": "ndcg_at_5", "value": 23.769000000000002}, {"type": "precision_at_1", "value": 23.257}, {"type": "precision_at_10", "value": 8.704}, {"type": "precision_at_100", "value": 1.606}, {"type": "precision_at_1000", "value": 0.22499999999999998}, {"type": "precision_at_3", "value": 16.287}, {"type": "precision_at_5", "value": 13.068}, {"type": "recall_at_1", "value": 10.366999999999999}, {"type": "recall_at_10", "value": 33.706}, {"type": "recall_at_100", "value": 57.375}, {"type": "recall_at_1000", "value": 76.79}, {"type": "recall_at_3", "value": 20.18}, {"type": "recall_at_5", "value": 26.215}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB DBPedia", "type": "dbpedia-entity", "config": "default", "split": "test", "revision": 
"f097057d03ed98220bc7309ddb10b71a54d667d6"}, "metrics": [{"type": "map_at_1", "value": 8.246}, {"type": "map_at_10", "value": 15.979}, {"type": "map_at_100", "value": 21.025}, {"type": "map_at_1000", "value": 22.189999999999998}, {"type": "map_at_3", "value": 11.997}, {"type": "map_at_5", "value": 13.697000000000001}, {"type": "mrr_at_1", "value": 60.75000000000001}, {"type": "mrr_at_10", "value": 68.70100000000001}, {"type": "mrr_at_100", "value": 69.1}, {"type": "mrr_at_1000", "value": 69.111}, {"type": "mrr_at_3", "value": 66.583}, {"type": "mrr_at_5", "value": 67.87100000000001}, {"type": "ndcg_at_1", "value": 49.75}, {"type": "ndcg_at_10", "value": 34.702}, {"type": "ndcg_at_100", "value": 37.607}, {"type": "ndcg_at_1000", "value": 44.322}, {"type": "ndcg_at_3", "value": 39.555}, {"type": "ndcg_at_5", "value": 36.684}, {"type": "precision_at_1", "value": 60.75000000000001}, {"type": "precision_at_10", "value": 26.625}, {"type": "precision_at_100", "value": 7.969999999999999}, {"type": "precision_at_1000", "value": 1.678}, {"type": "precision_at_3", "value": 41.833}, {"type": "precision_at_5", "value": 34.5}, {"type": "recall_at_1", "value": 8.246}, {"type": "recall_at_10", "value": 20.968}, {"type": "recall_at_100", "value": 42.065000000000005}, {"type": "recall_at_1000", "value": 63.671}, {"type": "recall_at_3", "value": 13.039000000000001}, {"type": "recall_at_5", "value": 16.042}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB EmotionClassification", "type": "mteb/emotion", "config": "default", "split": "test", "revision": "829147f8f75a25f005913200eb5ed41fae320aa1"}, "metrics": [{"type": "accuracy", "value": 49.214999999999996}, {"type": "f1", "value": 44.85952451163755}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB FEVER", "type": "fever", "config": "default", "split": "test", "revision": "1429cf27e393599b8b359b9b72c666f96b2525f9"}, "metrics": [{"type": "map_at_1", "value": 56.769000000000005}, {"type": "map_at_10", "value": 67.30199999999999}, {"type": "map_at_100", "value": 67.692}, {"type": "map_at_1000", "value": 67.712}, {"type": "map_at_3", "value": 65.346}, {"type": "map_at_5", "value": 66.574}, {"type": "mrr_at_1", "value": 61.370999999999995}, {"type": "mrr_at_10", "value": 71.875}, {"type": "mrr_at_100", "value": 72.195}, {"type": "mrr_at_1000", "value": 72.206}, {"type": "mrr_at_3", "value": 70.04}, {"type": "mrr_at_5", "value": 71.224}, {"type": "ndcg_at_1", "value": 61.370999999999995}, {"type": "ndcg_at_10", "value": 72.731}, {"type": "ndcg_at_100", "value": 74.468}, {"type": "ndcg_at_1000", "value": 74.91600000000001}, {"type": "ndcg_at_3", "value": 69.077}, {"type": "ndcg_at_5", "value": 71.111}, {"type": "precision_at_1", "value": 61.370999999999995}, {"type": "precision_at_10", "value": 9.325999999999999}, {"type": "precision_at_100", "value": 1.03}, {"type": "precision_at_1000", "value": 0.108}, {"type": "precision_at_3", "value": 27.303}, {"type": "precision_at_5", "value": 17.525}, {"type": "recall_at_1", "value": 56.769000000000005}, {"type": "recall_at_10", "value": 85.06}, {"type": "recall_at_100", "value": 92.767}, {"type": "recall_at_1000", "value": 95.933}, {"type": "recall_at_3", "value": 75.131}, {"type": "recall_at_5", "value": 80.17}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB FiQA2018", "type": "fiqa", "config": "default", "split": "test", "revision": "41b686a7f28c59bcaaa5791efd47c67c8ebe28be"}, "metrics": [{"type": "map_at_1", "value": 15.753}, {"type": "map_at_10", "value": 25.875999999999998}, 
{"type": "map_at_100", "value": 27.415}, {"type": "map_at_1000", "value": 27.590999999999998}, {"type": "map_at_3", "value": 22.17}, {"type": "map_at_5", "value": 24.236}, {"type": "mrr_at_1", "value": 31.019000000000002}, {"type": "mrr_at_10", "value": 39.977000000000004}, {"type": "mrr_at_100", "value": 40.788999999999994}, {"type": "mrr_at_1000", "value": 40.832}, {"type": "mrr_at_3", "value": 37.088}, {"type": "mrr_at_5", "value": 38.655}, {"type": "ndcg_at_1", "value": 31.019000000000002}, {"type": "ndcg_at_10", "value": 33.286}, {"type": "ndcg_at_100", "value": 39.528999999999996}, {"type": "ndcg_at_1000", "value": 42.934}, {"type": "ndcg_at_3", "value": 29.29}, {"type": "ndcg_at_5", "value": 30.615}, {"type": "precision_at_1", "value": 31.019000000000002}, {"type": "precision_at_10", "value": 9.383}, {"type": "precision_at_100", "value": 1.6019999999999999}, {"type": "precision_at_1000", "value": 0.22200000000000003}, {"type": "precision_at_3", "value": 19.753}, {"type": "precision_at_5", "value": 14.815000000000001}, {"type": "recall_at_1", "value": 15.753}, {"type": "recall_at_10", "value": 40.896}, {"type": "recall_at_100", "value": 64.443}, {"type": "recall_at_1000", "value": 85.218}, {"type": "recall_at_3", "value": 26.526}, {"type": "recall_at_5", "value": 32.452999999999996}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB HotpotQA", "type": "hotpotqa", "config": "default", "split": "test", "revision": "766870b35a1b9ca65e67a0d1913899973551fc6c"}, "metrics": [{"type": "map_at_1", "value": 32.153999999999996}, {"type": "map_at_10", "value": 43.651}, {"type": "map_at_100", "value": 44.41}, {"type": "map_at_1000", "value": 44.487}, {"type": "map_at_3", "value": 41.239}, {"type": "map_at_5", "value": 42.659000000000006}, {"type": "mrr_at_1", "value": 64.30799999999999}, {"type": "mrr_at_10", "value": 71.22500000000001}, {"type": "mrr_at_100", "value": 71.57}, {"type": "mrr_at_1000", "value": 71.59100000000001}, {"type": "mrr_at_3", "value": 69.95}, {"type": "mrr_at_5", "value": 70.738}, {"type": "ndcg_at_1", "value": 64.30799999999999}, {"type": "ndcg_at_10", "value": 52.835}, {"type": "ndcg_at_100", "value": 55.840999999999994}, {"type": "ndcg_at_1000", "value": 57.484}, {"type": "ndcg_at_3", "value": 49.014}, {"type": "ndcg_at_5", "value": 51.01599999999999}, {"type": "precision_at_1", "value": 64.30799999999999}, {"type": "precision_at_10", "value": 10.77}, {"type": "precision_at_100", "value": 1.315}, {"type": "precision_at_1000", "value": 0.153}, {"type": "precision_at_3", "value": 30.223}, {"type": "precision_at_5", "value": 19.716}, {"type": "recall_at_1", "value": 32.153999999999996}, {"type": "recall_at_10", "value": 53.849000000000004}, {"type": "recall_at_100", "value": 65.75999999999999}, {"type": "recall_at_1000", "value": 76.705}, {"type": "recall_at_3", "value": 45.334}, {"type": "recall_at_5", "value": 49.291000000000004}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB ImdbClassification", "type": "mteb/imdb", "config": "default", "split": "test", "revision": "8d743909f834c38949e8323a8a6ce8721ea6c7f4"}, "metrics": [{"type": "accuracy", "value": 63.5316}, {"type": "ap", "value": 58.90084300359825}, {"type": "f1", "value": 63.35727889030892}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB MSMARCO", "type": "msmarco", "config": "default", "split": "validation", "revision": "e6838a846e2408f22cf5cc337ebc83e0bcf77849"}, "metrics": [{"type": "map_at_1", "value": 20.566000000000003}, {"type": "map_at_10", "value": 32.229}, 
{"type": "map_at_100", "value": 33.445}, {"type": "map_at_1000", "value": 33.501}, {"type": "map_at_3", "value": 28.504}, {"type": "map_at_5", "value": 30.681000000000004}, {"type": "mrr_at_1", "value": 21.218}, {"type": "mrr_at_10", "value": 32.816}, {"type": "mrr_at_100", "value": 33.986}, {"type": "mrr_at_1000", "value": 34.035}, {"type": "mrr_at_3", "value": 29.15}, {"type": "mrr_at_5", "value": 31.290000000000003}, {"type": "ndcg_at_1", "value": 21.218}, {"type": "ndcg_at_10", "value": 38.832}, {"type": "ndcg_at_100", "value": 44.743}, {"type": "ndcg_at_1000", "value": 46.138}, {"type": "ndcg_at_3", "value": 31.232}, {"type": "ndcg_at_5", "value": 35.099999999999994}, {"type": "precision_at_1", "value": 21.218}, {"type": "precision_at_10", "value": 6.186}, {"type": "precision_at_100", "value": 0.914}, {"type": "precision_at_1000", "value": 0.10300000000000001}, {"type": "precision_at_3", "value": 13.314}, {"type": "precision_at_5", "value": 9.943}, {"type": "recall_at_1", "value": 20.566000000000003}, {"type": "recall_at_10", "value": 59.192}, {"type": "recall_at_100", "value": 86.626}, {"type": "recall_at_1000", "value": 97.283}, {"type": "recall_at_3", "value": 38.492}, {"type": "recall_at_5", "value": 47.760000000000005}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MTOPDomainClassification (en)", "type": "mteb/mtop_domain", "config": "en", "split": "test", "revision": "a7e2a951126a26fc8c6a69f835f33a346ba259e3"}, "metrics": [{"type": "accuracy", "value": 92.56269949840402}, {"type": "f1", "value": 92.1020975473988}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MTOPIntentClassification (en)", "type": "mteb/mtop_intent", "config": "en", "split": "test", "revision": "6299947a7777084cc2d4b64235bf7190381ce755"}, "metrics": [{"type": "accuracy", "value": 71.8467852257182}, {"type": "f1", "value": 53.652719348592015}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MassiveIntentClassification (en)", "type": "mteb/amazon_massive_intent", "config": "en", "split": "test", "revision": "072a486a144adf7f4479a4a0dddb2152e161e1ea"}, "metrics": [{"type": "accuracy", "value": 69.00806993947546}, {"type": "f1", "value": 67.41429618885515}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MassiveScenarioClassification (en)", "type": "mteb/amazon_massive_scenario", "config": "en", "split": "test", "revision": "7d571f92784cd94a019292a1f45445077d0ef634"}, "metrics": [{"type": "accuracy", "value": 75.90114324142569}, {"type": "f1", "value": 76.25183590651454}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB MedrxivClusteringP2P", "type": "mteb/medrxiv-clustering-p2p", "config": "default", "split": "test", "revision": "dcefc037ef84348e49b0d29109e891c01067226b"}, "metrics": [{"type": "v_measure", "value": 31.350109978273395}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB MedrxivClusteringS2S", "type": "mteb/medrxiv-clustering-s2s", "config": "default", "split": "test", "revision": "3cd0e71dfbe09d4de0f9e5ecba43e7ce280959dc"}, "metrics": [{"type": "v_measure", "value": 28.768923695767327}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB MindSmallReranking", "type": "mteb/mind_small", "config": "default", "split": "test", "revision": "3bdac13927fdc888b903db93b2ffdbd90b295a69"}, "metrics": [{"type": "map", "value": 31.716396735210754}, {"type": "mrr", "value": 32.88970538547634}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB NFCorpus", "type": "nfcorpus", "config": "default", "split": 
"test", "revision": "7eb63cc0c1eb59324d709ebed25fcab851fa7610"}, "metrics": [{"type": "map_at_1", "value": 5.604}, {"type": "map_at_10", "value": 12.379999999999999}, {"type": "map_at_100", "value": 15.791}, {"type": "map_at_1000", "value": 17.327}, {"type": "map_at_3", "value": 9.15}, {"type": "map_at_5", "value": 10.599}, {"type": "mrr_at_1", "value": 45.201}, {"type": "mrr_at_10", "value": 53.374}, {"type": "mrr_at_100", "value": 54.089}, {"type": "mrr_at_1000", "value": 54.123}, {"type": "mrr_at_3", "value": 51.44499999999999}, {"type": "mrr_at_5", "value": 52.59}, {"type": "ndcg_at_1", "value": 42.879}, {"type": "ndcg_at_10", "value": 33.891}, {"type": "ndcg_at_100", "value": 31.391999999999996}, {"type": "ndcg_at_1000", "value": 40.36}, {"type": "ndcg_at_3", "value": 39.076}, {"type": "ndcg_at_5", "value": 37.047000000000004}, {"type": "precision_at_1", "value": 44.582}, {"type": "precision_at_10", "value": 25.294}, {"type": "precision_at_100", "value": 8.285}, {"type": "precision_at_1000", "value": 2.1479999999999997}, {"type": "precision_at_3", "value": 36.120000000000005}, {"type": "precision_at_5", "value": 31.95}, {"type": "recall_at_1", "value": 5.604}, {"type": "recall_at_10", "value": 16.239}, {"type": "recall_at_100", "value": 32.16}, {"type": "recall_at_1000", "value": 64.513}, {"type": "recall_at_3", "value": 10.406}, {"type": "recall_at_5", "value": 12.684999999999999}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB NQ", "type": "nq", "config": "default", "split": "test", "revision": "6062aefc120bfe8ece5897809fb2e53bfe0d128c"}, "metrics": [{"type": "map_at_1", "value": 25.881}, {"type": "map_at_10", "value": 39.501}, {"type": "map_at_100", "value": 40.615}, {"type": "map_at_1000", "value": 40.661}, {"type": "map_at_3", "value": 35.559000000000005}, {"type": "map_at_5", "value": 37.773}, {"type": "mrr_at_1", "value": 29.229}, {"type": "mrr_at_10", "value": 41.955999999999996}, {"type": "mrr_at_100", "value": 42.86}, {"type": "mrr_at_1000", "value": 42.893}, {"type": "mrr_at_3", "value": 38.562000000000005}, {"type": "mrr_at_5", "value": 40.542}, {"type": "ndcg_at_1", "value": 29.2}, {"type": "ndcg_at_10", "value": 46.703}, {"type": "ndcg_at_100", "value": 51.644}, {"type": "ndcg_at_1000", "value": 52.771}, {"type": "ndcg_at_3", "value": 39.141999999999996}, {"type": "ndcg_at_5", "value": 42.892}, {"type": "precision_at_1", "value": 29.2}, {"type": "precision_at_10", "value": 7.920000000000001}, {"type": "precision_at_100", "value": 1.0659999999999998}, {"type": "precision_at_1000", "value": 0.117}, {"type": "precision_at_3", "value": 18.105}, {"type": "precision_at_5", "value": 13.036}, {"type": "recall_at_1", "value": 25.881}, {"type": "recall_at_10", "value": 66.266}, {"type": "recall_at_100", "value": 88.116}, {"type": "recall_at_1000", "value": 96.58200000000001}, {"type": "recall_at_3", "value": 46.526}, {"type": "recall_at_5", "value": 55.154}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB QuoraRetrieval", "type": "quora", "config": "default", "split": "test", "revision": "6205996560df11e3a3da9ab4f926788fc30a7db4"}, "metrics": [{"type": "map_at_1", "value": 67.553}, {"type": "map_at_10", "value": 81.34}, {"type": "map_at_100", "value": 82.002}, {"type": "map_at_1000", "value": 82.027}, {"type": "map_at_3", "value": 78.281}, {"type": "map_at_5", "value": 80.149}, {"type": "mrr_at_1", "value": 77.72}, {"type": "mrr_at_10", "value": 84.733}, {"type": "mrr_at_100", "value": 84.878}, {"type": "mrr_at_1000", "value": 84.879}, {"type": "mrr_at_3", 
"value": 83.587}, {"type": "mrr_at_5", "value": 84.32600000000001}, {"type": "ndcg_at_1", "value": 77.75}, {"type": "ndcg_at_10", "value": 85.603}, {"type": "ndcg_at_100", "value": 87.069}, {"type": "ndcg_at_1000", "value": 87.25}, {"type": "ndcg_at_3", "value": 82.303}, {"type": "ndcg_at_5", "value": 84.03699999999999}, {"type": "precision_at_1", "value": 77.75}, {"type": "precision_at_10", "value": 13.04}, {"type": "precision_at_100", "value": 1.5070000000000001}, {"type": "precision_at_1000", "value": 0.156}, {"type": "precision_at_3", "value": 35.903}, {"type": "precision_at_5", "value": 23.738}, {"type": "recall_at_1", "value": 67.553}, {"type": "recall_at_10", "value": 93.903}, {"type": "recall_at_100", "value": 99.062}, {"type": "recall_at_1000", "value": 99.935}, {"type": "recall_at_3", "value": 84.58099999999999}, {"type": "recall_at_5", "value": 89.316}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB RedditClustering", "type": "mteb/reddit-clustering", "config": "default", "split": "test", "revision": "b2805658ae38990172679479369a78b86de8c390"}, "metrics": [{"type": "v_measure", "value": 46.46887711230235}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB RedditClusteringP2P", "type": "mteb/reddit-clustering-p2p", "config": "default", "split": "test", "revision": "385e3cb46b4cfa89021f56c4380204149d0efe33"}, "metrics": [{"type": "v_measure", "value": 54.166876298246926}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB SCIDOCS", "type": "scidocs", "config": "default", "split": "test", "revision": "5c59ef3e437a0a9651c8fe6fde943e7dce59fba5"}, "metrics": [{"type": "map_at_1", "value": 4.053}, {"type": "map_at_10", "value": 9.693999999999999}, {"type": "map_at_100", "value": 11.387}, {"type": "map_at_1000", "value": 11.654}, {"type": "map_at_3", "value": 7.053}, {"type": "map_at_5", "value": 8.439}, {"type": "mrr_at_1", "value": 19.900000000000002}, {"type": "mrr_at_10", "value": 29.359}, {"type": "mrr_at_100", "value": 30.484}, {"type": "mrr_at_1000", "value": 30.553}, {"type": "mrr_at_3", "value": 26.200000000000003}, {"type": "mrr_at_5", "value": 28.115000000000002}, {"type": "ndcg_at_1", "value": 19.900000000000002}, {"type": "ndcg_at_10", "value": 16.575}, {"type": "ndcg_at_100", "value": 23.655}, {"type": "ndcg_at_1000", "value": 28.853}, {"type": "ndcg_at_3", "value": 15.848}, {"type": "ndcg_at_5", "value": 14.026}, {"type": "precision_at_1", "value": 19.900000000000002}, {"type": "precision_at_10", "value": 8.450000000000001}, {"type": "precision_at_100", "value": 1.872}, {"type": "precision_at_1000", "value": 0.313}, {"type": "precision_at_3", "value": 14.667}, {"type": "precision_at_5", "value": 12.32}, {"type": "recall_at_1", "value": 4.053}, {"type": "recall_at_10", "value": 17.169999999999998}, {"type": "recall_at_100", "value": 38.025}, {"type": "recall_at_1000", "value": 63.571999999999996}, {"type": "recall_at_3", "value": 8.903}, {"type": "recall_at_5", "value": 12.477}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB SICK-R", "type": "mteb/sickr-sts", "config": "default", "split": "test", "revision": "20a6d6f312dd54037fe07a32d58e5e168867909d"}, "metrics": [{"type": "cos_sim_pearson", "value": 77.7548748519677}, {"type": "cos_sim_spearman", "value": 68.19926431966059}, {"type": "euclidean_pearson", "value": 71.69016204991725}, {"type": "euclidean_spearman", "value": 66.98099673026834}, {"type": "manhattan_pearson", "value": 71.62994072488664}, {"type": "manhattan_spearman", "value": 67.03435950744577}]}, {"task": {"type": 
"STS"}, "dataset": {"name": "MTEB STS12", "type": "mteb/sts12-sts", "config": "default", "split": "test", "revision": "fdf84275bb8ce4b49c971d02e84dd1abc677a50f"}, "metrics": [{"type": "cos_sim_pearson", "value": 75.91051402657887}, {"type": "cos_sim_spearman", "value": 66.99390786191645}, {"type": "euclidean_pearson", "value": 71.54128036454578}, {"type": "euclidean_spearman", "value": 69.25605675649068}, {"type": "manhattan_pearson", "value": 71.60981030780171}, {"type": "manhattan_spearman", "value": 69.27513670128046}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS13", "type": "mteb/sts13-sts", "config": "default", "split": "test", "revision": "1591bfcbe8c69d4bf7fe2a16e2451017832cafb9"}, "metrics": [{"type": "cos_sim_pearson", "value": 77.23835466417793}, {"type": "cos_sim_spearman", "value": 77.57623085766706}, {"type": "euclidean_pearson", "value": 77.5090992200725}, {"type": "euclidean_spearman", "value": 77.88601688144924}, {"type": "manhattan_pearson", "value": 77.39045060647423}, {"type": "manhattan_spearman", "value": 77.77552718279098}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS14", "type": "mteb/sts14-sts", "config": "default", "split": "test", "revision": "e2125984e7df8b7871f6ae9949cf6b6795e7c54b"}, "metrics": [{"type": "cos_sim_pearson", "value": 77.91692485139602}, {"type": "cos_sim_spearman", "value": 72.78258293483495}, {"type": "euclidean_pearson", "value": 74.64773017077789}, {"type": "euclidean_spearman", "value": 71.81662299104619}, {"type": "manhattan_pearson", "value": 74.71043337995533}, {"type": "manhattan_spearman", "value": 71.83960860845646}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS15", "type": "mteb/sts15-sts", "config": "default", "split": "test", "revision": "1cd7298cac12a96a373b6a2f18738bb3e739a9b6"}, "metrics": [{"type": "cos_sim_pearson", "value": 82.13422113617578}, {"type": "cos_sim_spearman", "value": 82.61707296911949}, {"type": "euclidean_pearson", "value": 81.42487480400861}, {"type": "euclidean_spearman", "value": 82.17970991273835}, {"type": "manhattan_pearson", "value": 81.41985055477845}, {"type": "manhattan_spearman", "value": 82.15823204362937}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS16", "type": "mteb/sts16-sts", "config": "default", "split": "test", "revision": "360a0b2dff98700d09e634a01e1cc1624d3e42cd"}, "metrics": [{"type": "cos_sim_pearson", "value": 79.07989542843826}, {"type": "cos_sim_spearman", "value": 80.09839524406284}, {"type": "euclidean_pearson", "value": 76.43186028364195}, {"type": "euclidean_spearman", "value": 76.76720323266471}, {"type": "manhattan_pearson", "value": 76.4674747409161}, {"type": "manhattan_spearman", "value": 76.81797407068667}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS17 (en-en)", "type": "mteb/sts17-crosslingual-sts", "config": "en-en", "split": "test", "revision": "9fc37e8c632af1c87a3d23e685d49552a02582a0"}, "metrics": [{"type": "cos_sim_pearson", "value": 87.0420983224933}, {"type": "cos_sim_spearman", "value": 87.25017540413702}, {"type": "euclidean_pearson", "value": 84.56384596473421}, {"type": "euclidean_spearman", "value": 84.72557417564886}, {"type": "manhattan_pearson", "value": 84.7329954474549}, {"type": "manhattan_spearman", "value": 84.75071371008909}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS22 (en)", "type": "mteb/sts22-crosslingual-sts", "config": "en", "split": "test", "revision": "2de6ce8c1921b71a755b262c6b57fef195dd7906"}, "metrics": [{"type": "cos_sim_pearson", "value": 68.47031320016424}, 
{"type": "cos_sim_spearman", "value": 68.7486910762485}, {"type": "euclidean_pearson", "value": 71.30330985913915}, {"type": "euclidean_spearman", "value": 71.59666258520735}, {"type": "manhattan_pearson", "value": 71.4423884279027}, {"type": "manhattan_spearman", "value": 71.67460706861044}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STSBenchmark", "type": "mteb/stsbenchmark-sts", "config": "default", "split": "test", "revision": "8913289635987208e6e7c72789e4be2fe94b6abd"}, "metrics": [{"type": "cos_sim_pearson", "value": 80.79514366062675}, {"type": "cos_sim_spearman", "value": 79.20585637461048}, {"type": "euclidean_pearson", "value": 78.6591557395699}, {"type": "euclidean_spearman", "value": 77.86455794285718}, {"type": "manhattan_pearson", "value": 78.67754806486865}, {"type": "manhattan_spearman", "value": 77.88178687200732}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB SciDocsRR", "type": "mteb/scidocs-reranking", "config": "default", "split": "test", "revision": "56a6d0140cf6356659e2a7c1413286a774468d44"}, "metrics": [{"type": "map", "value": 77.71580844366375}, {"type": "mrr", "value": 93.04215845882513}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB SciFact", "type": "scifact", "config": "default", "split": "test", "revision": "a75ae049398addde9b70f6b268875f5cbce99089"}, "metrics": [{"type": "map_at_1", "value": 56.39999999999999}, {"type": "map_at_10", "value": 65.701}, {"type": "map_at_100", "value": 66.32000000000001}, {"type": "map_at_1000", "value": 66.34100000000001}, {"type": "map_at_3", "value": 62.641999999999996}, {"type": "map_at_5", "value": 64.342}, {"type": "mrr_at_1", "value": 58.667}, {"type": "mrr_at_10", "value": 66.45299999999999}, {"type": "mrr_at_100", "value": 66.967}, {"type": "mrr_at_1000", "value": 66.988}, {"type": "mrr_at_3", "value": 64.11099999999999}, {"type": "mrr_at_5", "value": 65.411}, {"type": "ndcg_at_1", "value": 58.667}, {"type": "ndcg_at_10", "value": 70.165}, {"type": "ndcg_at_100", "value": 72.938}, {"type": "ndcg_at_1000", "value": 73.456}, {"type": "ndcg_at_3", "value": 64.79}, {"type": "ndcg_at_5", "value": 67.28}, {"type": "precision_at_1", "value": 58.667}, {"type": "precision_at_10", "value": 9.4}, {"type": "precision_at_100", "value": 1.087}, {"type": "precision_at_1000", "value": 0.11299999999999999}, {"type": "precision_at_3", "value": 24.889}, {"type": "precision_at_5", "value": 16.667}, {"type": "recall_at_1", "value": 56.39999999999999}, {"type": "recall_at_10", "value": 83.122}, {"type": "recall_at_100", "value": 95.667}, {"type": "recall_at_1000", "value": 99.667}, {"type": "recall_at_3", "value": 68.378}, {"type": "recall_at_5", "value": 74.68299999999999}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB SprintDuplicateQuestions", "type": "mteb/sprintduplicatequestions-pairclassification", "config": "default", "split": "test", "revision": "5a8256d0dff9c4bd3be3ba3e67e4e70173f802ea"}, "metrics": [{"type": "cos_sim_accuracy", "value": 99.76831683168317}, {"type": "cos_sim_ap", "value": 93.47124923047998}, {"type": "cos_sim_f1", "value": 88.06122448979592}, {"type": "cos_sim_precision", "value": 89.89583333333333}, {"type": "cos_sim_recall", "value": 86.3}, {"type": "dot_accuracy", "value": 99.57326732673268}, {"type": "dot_ap", "value": 84.06577868167207}, {"type": "dot_f1", "value": 77.82629791363416}, {"type": "dot_precision", "value": 75.58906691800189}, {"type": "dot_recall", "value": 80.2}, {"type": "euclidean_accuracy", "value": 99.74257425742574}, {"type": 
"euclidean_ap", "value": 92.1904681653555}, {"type": "euclidean_f1", "value": 86.74821610601427}, {"type": "euclidean_precision", "value": 88.46153846153845}, {"type": "euclidean_recall", "value": 85.1}, {"type": "manhattan_accuracy", "value": 99.74554455445545}, {"type": "manhattan_ap", "value": 92.4337790809948}, {"type": "manhattan_f1", "value": 86.86765457332653}, {"type": "manhattan_precision", "value": 88.81922675026124}, {"type": "manhattan_recall", "value": 85.0}, {"type": "max_accuracy", "value": 99.76831683168317}, {"type": "max_ap", "value": 93.47124923047998}, {"type": "max_f1", "value": 88.06122448979592}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB StackExchangeClustering", "type": "mteb/stackexchange-clustering", "config": "default", "split": "test", "revision": "70a89468f6dccacc6aa2b12a6eac54e74328f235"}, "metrics": [{"type": "v_measure", "value": 59.194098673976484}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB StackExchangeClusteringP2P", "type": "mteb/stackexchange-clustering-p2p", "config": "default", "split": "test", "revision": "d88009ab563dd0b16cfaf4436abaf97fa3550cf0"}, "metrics": [{"type": "v_measure", "value": 32.5744032578115}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB StackOverflowDupQuestions", "type": "mteb/stackoverflowdupquestions-reranking", "config": "default", "split": "test", "revision": "ef807ea29a75ec4f91b50fd4191cb4ee4589a9f9"}, "metrics": [{"type": "map", "value": 49.61186384154483}, {"type": "mrr", "value": 50.55424253034547}]}, {"task": {"type": "Summarization"}, "dataset": {"name": "MTEB SummEval", "type": "mteb/summeval", "config": "default", "split": "test", "revision": "8753c2788d36c01fc6f05d03fe3f7268d63f9122"}, "metrics": [{"type": "cos_sim_pearson", "value": 30.027210161713946}, {"type": "cos_sim_spearman", "value": 31.030178065751734}, {"type": "dot_pearson", "value": 30.09179785685587}, {"type": "dot_spearman", "value": 30.408303252207812}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB TRECCOVID", "type": "trec-covid", "config": "default", "split": "test", "revision": "2c8041b2c07a79b6f7ba8fe6acc72e5d9f92d217"}, "metrics": [{"type": "map_at_1", "value": 0.22300000000000003}, {"type": "map_at_10", "value": 1.762}, {"type": "map_at_100", "value": 9.984}, {"type": "map_at_1000", "value": 24.265}, {"type": "map_at_3", "value": 0.631}, {"type": "map_at_5", "value": 0.9950000000000001}, {"type": "mrr_at_1", "value": 88.0}, {"type": "mrr_at_10", "value": 92.833}, {"type": "mrr_at_100", "value": 92.833}, {"type": "mrr_at_1000", "value": 92.833}, {"type": "mrr_at_3", "value": 92.333}, {"type": "mrr_at_5", "value": 92.833}, {"type": "ndcg_at_1", "value": 83.0}, {"type": "ndcg_at_10", "value": 75.17}, {"type": "ndcg_at_100", "value": 55.432}, {"type": "ndcg_at_1000", "value": 49.482}, {"type": "ndcg_at_3", "value": 82.184}, {"type": "ndcg_at_5", "value": 79.712}, {"type": "precision_at_1", "value": 88.0}, {"type": "precision_at_10", "value": 78.60000000000001}, {"type": "precision_at_100", "value": 56.56}, {"type": "precision_at_1000", "value": 22.334}, {"type": "precision_at_3", "value": 86.667}, {"type": "precision_at_5", "value": 83.6}, {"type": "recall_at_1", "value": 0.22300000000000003}, {"type": "recall_at_10", "value": 1.9879999999999998}, {"type": "recall_at_100", "value": 13.300999999999998}, {"type": "recall_at_1000", "value": 46.587}, {"type": "recall_at_3", "value": 0.6629999999999999}, {"type": "recall_at_5", "value": 1.079}]}, {"task": {"type": "Retrieval"}, "dataset": 
{"name": "MTEB Touche2020", "type": "webis-touche2020", "config": "default", "split": "test", "revision": "527b7d77e16e343303e68cb6af11d6e18b9f7b3b"}, "metrics": [{"type": "map_at_1", "value": 3.047}, {"type": "map_at_10", "value": 8.792}, {"type": "map_at_100", "value": 14.631}, {"type": "map_at_1000", "value": 16.127}, {"type": "map_at_3", "value": 4.673}, {"type": "map_at_5", "value": 5.897}, {"type": "mrr_at_1", "value": 38.775999999999996}, {"type": "mrr_at_10", "value": 49.271}, {"type": "mrr_at_100", "value": 50.181}, {"type": "mrr_at_1000", "value": 50.2}, {"type": "mrr_at_3", "value": 44.558}, {"type": "mrr_at_5", "value": 47.925000000000004}, {"type": "ndcg_at_1", "value": 35.714}, {"type": "ndcg_at_10", "value": 23.44}, {"type": "ndcg_at_100", "value": 35.345}, {"type": "ndcg_at_1000", "value": 46.495}, {"type": "ndcg_at_3", "value": 26.146}, {"type": "ndcg_at_5", "value": 24.878}, {"type": "precision_at_1", "value": 38.775999999999996}, {"type": "precision_at_10", "value": 20.816000000000003}, {"type": "precision_at_100", "value": 7.428999999999999}, {"type": "precision_at_1000", "value": 1.494}, {"type": "precision_at_3", "value": 25.85}, {"type": "precision_at_5", "value": 24.082}, {"type": "recall_at_1", "value": 3.047}, {"type": "recall_at_10", "value": 14.975}, {"type": "recall_at_100", "value": 45.943}, {"type": "recall_at_1000", "value": 80.31099999999999}, {"type": "recall_at_3", "value": 5.478000000000001}, {"type": "recall_at_5", "value": 8.294}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB ToxicConversationsClassification", "type": "mteb/toxic_conversations_50k", "config": "default", "split": "test", "revision": "edfaf9da55d3dd50d43143d90c1ac476895ae6de"}, "metrics": [{"type": "accuracy", "value": 68.84080000000002}, {"type": "ap", "value": 13.135219251019848}, {"type": "f1", "value": 52.849999421995506}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB TweetSentimentExtractionClassification", "type": "mteb/tweet_sentiment_extraction", "config": "default", "split": "test", "revision": "62146448f05be9e52a36b8ee9936447ea787eede"}, "metrics": [{"type": "accuracy", "value": 56.68647425014149}, {"type": "f1", "value": 56.97981427365949}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB TwentyNewsgroupsClustering", "type": "mteb/twentynewsgroups-clustering", "config": "default", "split": "test", "revision": "091a54f9a36281ce7d6590ec8c75dd485e7e01d4"}, "metrics": [{"type": "v_measure", "value": 40.8911707239219}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB TwitterSemEval2015", "type": "mteb/twittersemeval2015-pairclassification", "config": "default", "split": "test", "revision": "70970daeab8776df92f5ea462b6173c0b46fd2d1"}, "metrics": [{"type": "cos_sim_accuracy", "value": 83.04226023722954}, {"type": "cos_sim_ap", "value": 63.681339908301325}, {"type": "cos_sim_f1", "value": 60.349184470480125}, {"type": "cos_sim_precision", "value": 53.437754271765655}, {"type": "cos_sim_recall", "value": 69.31398416886545}, {"type": "dot_accuracy", "value": 81.46271681468677}, {"type": "dot_ap", "value": 57.78072296265885}, {"type": "dot_f1", "value": 56.28769265132901}, {"type": "dot_precision", "value": 48.7993803253292}, {"type": "dot_recall", "value": 66.49076517150397}, {"type": "euclidean_accuracy", "value": 82.16606067830959}, {"type": "euclidean_ap", "value": 59.974530371203514}, {"type": "euclidean_f1", "value": 56.856023506366306}, {"type": "euclidean_precision", "value": 53.037916857012334}, {"type": 
"euclidean_recall", "value": 61.2664907651715}, {"type": "manhattan_accuracy", "value": 82.16606067830959}, {"type": "manhattan_ap", "value": 59.98962379571767}, {"type": "manhattan_f1", "value": 56.98153158451947}, {"type": "manhattan_precision", "value": 51.41158989598811}, {"type": "manhattan_recall", "value": 63.90501319261214}, {"type": "max_accuracy", "value": 83.04226023722954}, {"type": "max_ap", "value": 63.681339908301325}, {"type": "max_f1", "value": 60.349184470480125}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB TwitterURLCorpus", "type": "mteb/twitterurlcorpus-pairclassification", "config": "default", "split": "test", "revision": "8b6510b0b1fa4e4c4f879467980e9be563ec1cdf"}, "metrics": [{"type": "cos_sim_accuracy", "value": 88.56871191834517}, {"type": "cos_sim_ap", "value": 84.80240716354544}, {"type": "cos_sim_f1", "value": 77.07765285922385}, {"type": "cos_sim_precision", "value": 74.84947406601378}, {"type": "cos_sim_recall", "value": 79.44256236526024}, {"type": "dot_accuracy", "value": 86.00923662048356}, {"type": "dot_ap", "value": 78.6556459012073}, {"type": "dot_f1", "value": 72.7583749109052}, {"type": "dot_precision", "value": 67.72823779193206}, {"type": "dot_recall", "value": 78.59562673236834}, {"type": "euclidean_accuracy", "value": 87.84103698529127}, {"type": "euclidean_ap", "value": 83.50424424952834}, {"type": "euclidean_f1", "value": 75.74496544549307}, {"type": "euclidean_precision", "value": 73.19402556369381}, {"type": "euclidean_recall", "value": 78.48013550970127}, {"type": "manhattan_accuracy", "value": 87.9225365777933}, {"type": "manhattan_ap", "value": 83.49479248597825}, {"type": "manhattan_f1", "value": 75.67748162447101}, {"type": "manhattan_precision", "value": 73.06810035842294}, {"type": "manhattan_recall", "value": 78.48013550970127}, {"type": "max_accuracy", "value": 88.56871191834517}, {"type": "max_ap", "value": 84.80240716354544}, {"type": "max_f1", "value": 77.07765285922385}]}]}]} |
Severian/nomic | Severian | feature-extraction | [
"sentence-transformers",
"nomic_bert",
"feature-extraction",
"sentence-similarity",
"mteb",
"transformers",
"transformers.js",
"custom_code",
"en",
"arxiv:2402.01613",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
]
| 2024-02-08T11:07:27 | 2024-02-08T11:08:45 | 9 | 0 | ---
language:
- en
library_name: sentence-transformers
license: apache-2.0
pipeline_tag: feature-extraction
tags:
- feature-extraction
- sentence-similarity
- mteb
- transformers
- transformers.js
model-index:
- name: epoch_0_model
results:
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (en)
type: mteb/amazon_counterfactual
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 76.8507462686567
- type: ap
value: 40.592189159090495
- type: f1
value: 71.01634655512476
- task:
type: Classification
dataset:
name: MTEB AmazonPolarityClassification
type: mteb/amazon_polarity
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 91.51892500000001
- type: ap
value: 88.50346762975335
- type: f1
value: 91.50342077459624
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (en)
type: mteb/amazon_reviews_multi
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 47.364
- type: f1
value: 46.72708080922794
- task:
type: Retrieval
dataset:
name: MTEB ArguAna
type: arguana
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 25.178
- type: map_at_10
value: 40.244
- type: map_at_100
value: 41.321999999999996
- type: map_at_1000
value: 41.331
- type: map_at_3
value: 35.016999999999996
- type: map_at_5
value: 37.99
- type: mrr_at_1
value: 25.605
- type: mrr_at_10
value: 40.422000000000004
- type: mrr_at_100
value: 41.507
- type: mrr_at_1000
value: 41.516
- type: mrr_at_3
value: 35.23
- type: mrr_at_5
value: 38.15
- type: ndcg_at_1
value: 25.178
- type: ndcg_at_10
value: 49.258
- type: ndcg_at_100
value: 53.776
- type: ndcg_at_1000
value: 53.995000000000005
- type: ndcg_at_3
value: 38.429
- type: ndcg_at_5
value: 43.803
- type: precision_at_1
value: 25.178
- type: precision_at_10
value: 7.831
- type: precision_at_100
value: 0.979
- type: precision_at_1000
value: 0.1
- type: precision_at_3
value: 16.121
- type: precision_at_5
value: 12.29
- type: recall_at_1
value: 25.178
- type: recall_at_10
value: 78.307
- type: recall_at_100
value: 97.866
- type: recall_at_1000
value: 99.57300000000001
- type: recall_at_3
value: 48.364000000000004
- type: recall_at_5
value: 61.451
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringP2P
type: mteb/arxiv-clustering-p2p
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 45.93034494751465
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringS2S
type: mteb/arxiv-clustering-s2s
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 36.64579480054327
- task:
type: Reranking
dataset:
name: MTEB AskUbuntuDupQuestions
type: mteb/askubuntudupquestions-reranking
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 60.601310529222054
- type: mrr
value: 75.04484896451656
- task:
type: STS
dataset:
name: MTEB BIOSSES
type: mteb/biosses-sts
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_pearson
value: 88.57797718095814
- type: cos_sim_spearman
value: 86.47064499110101
- type: euclidean_pearson
value: 87.4559602783142
- type: euclidean_spearman
value: 86.47064499110101
- type: manhattan_pearson
value: 87.7232764230245
- type: manhattan_spearman
value: 86.91222131777742
- task:
type: Classification
dataset:
name: MTEB Banking77Classification
type: mteb/banking77
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 84.5422077922078
- type: f1
value: 84.47657456950589
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringP2P
type: mteb/biorxiv-clustering-p2p
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 38.48953561974464
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringS2S
type: mteb/biorxiv-clustering-s2s
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 32.75995857510105
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackAndroidRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 30.008000000000003
- type: map_at_10
value: 39.51
- type: map_at_100
value: 40.841
- type: map_at_1000
value: 40.973
- type: map_at_3
value: 36.248999999999995
- type: map_at_5
value: 38.096999999999994
- type: mrr_at_1
value: 36.481
- type: mrr_at_10
value: 44.818000000000005
- type: mrr_at_100
value: 45.64
- type: mrr_at_1000
value: 45.687
- type: mrr_at_3
value: 42.036
- type: mrr_at_5
value: 43.782
- type: ndcg_at_1
value: 36.481
- type: ndcg_at_10
value: 45.152
- type: ndcg_at_100
value: 50.449
- type: ndcg_at_1000
value: 52.76499999999999
- type: ndcg_at_3
value: 40.161
- type: ndcg_at_5
value: 42.577999999999996
- type: precision_at_1
value: 36.481
- type: precision_at_10
value: 8.369
- type: precision_at_100
value: 1.373
- type: precision_at_1000
value: 0.186
- type: precision_at_3
value: 18.693
- type: precision_at_5
value: 13.533999999999999
- type: recall_at_1
value: 30.008000000000003
- type: recall_at_10
value: 56.108999999999995
- type: recall_at_100
value: 78.55499999999999
- type: recall_at_1000
value: 93.659
- type: recall_at_3
value: 41.754999999999995
- type: recall_at_5
value: 48.296
- type: map_at_1
value: 30.262
- type: map_at_10
value: 40.139
- type: map_at_100
value: 41.394
- type: map_at_1000
value: 41.526
- type: map_at_3
value: 37.155
- type: map_at_5
value: 38.785
- type: mrr_at_1
value: 38.153
- type: mrr_at_10
value: 46.369
- type: mrr_at_100
value: 47.072
- type: mrr_at_1000
value: 47.111999999999995
- type: mrr_at_3
value: 44.268
- type: mrr_at_5
value: 45.389
- type: ndcg_at_1
value: 38.153
- type: ndcg_at_10
value: 45.925
- type: ndcg_at_100
value: 50.394000000000005
- type: ndcg_at_1000
value: 52.37500000000001
- type: ndcg_at_3
value: 41.754000000000005
- type: ndcg_at_5
value: 43.574
- type: precision_at_1
value: 38.153
- type: precision_at_10
value: 8.796
- type: precision_at_100
value: 1.432
- type: precision_at_1000
value: 0.189
- type: precision_at_3
value: 20.318
- type: precision_at_5
value: 14.395
- type: recall_at_1
value: 30.262
- type: recall_at_10
value: 55.72200000000001
- type: recall_at_100
value: 74.97500000000001
- type: recall_at_1000
value: 87.342
- type: recall_at_3
value: 43.129
- type: recall_at_5
value: 48.336
- type: map_at_1
value: 39.951
- type: map_at_10
value: 51.248000000000005
- type: map_at_100
value: 52.188
- type: map_at_1000
value: 52.247
- type: map_at_3
value: 48.211
- type: map_at_5
value: 49.797000000000004
- type: mrr_at_1
value: 45.329
- type: mrr_at_10
value: 54.749
- type: mrr_at_100
value: 55.367999999999995
- type: mrr_at_1000
value: 55.400000000000006
- type: mrr_at_3
value: 52.382
- type: mrr_at_5
value: 53.649
- type: ndcg_at_1
value: 45.329
- type: ndcg_at_10
value: 56.847
- type: ndcg_at_100
value: 60.738
- type: ndcg_at_1000
value: 61.976
- type: ndcg_at_3
value: 51.59
- type: ndcg_at_5
value: 53.915
- type: precision_at_1
value: 45.329
- type: precision_at_10
value: 8.959
- type: precision_at_100
value: 1.187
- type: precision_at_1000
value: 0.134
- type: precision_at_3
value: 22.612
- type: precision_at_5
value: 15.273
- type: recall_at_1
value: 39.951
- type: recall_at_10
value: 70.053
- type: recall_at_100
value: 86.996
- type: recall_at_1000
value: 95.707
- type: recall_at_3
value: 56.032000000000004
- type: recall_at_5
value: 61.629999999999995
- type: map_at_1
value: 25.566
- type: map_at_10
value: 33.207
- type: map_at_100
value: 34.166000000000004
- type: map_at_1000
value: 34.245
- type: map_at_3
value: 30.94
- type: map_at_5
value: 32.01
- type: mrr_at_1
value: 27.345000000000002
- type: mrr_at_10
value: 35.193000000000005
- type: mrr_at_100
value: 35.965
- type: mrr_at_1000
value: 36.028999999999996
- type: mrr_at_3
value: 32.806000000000004
- type: mrr_at_5
value: 34.021
- type: ndcg_at_1
value: 27.345000000000002
- type: ndcg_at_10
value: 37.891999999999996
- type: ndcg_at_100
value: 42.664
- type: ndcg_at_1000
value: 44.757000000000005
- type: ndcg_at_3
value: 33.123000000000005
- type: ndcg_at_5
value: 35.035
- type: precision_at_1
value: 27.345000000000002
- type: precision_at_10
value: 5.763
- type: precision_at_100
value: 0.859
- type: precision_at_1000
value: 0.108
- type: precision_at_3
value: 13.71
- type: precision_at_5
value: 9.401
- type: recall_at_1
value: 25.566
- type: recall_at_10
value: 50.563
- type: recall_at_100
value: 72.86399999999999
- type: recall_at_1000
value: 88.68599999999999
- type: recall_at_3
value: 37.43
- type: recall_at_5
value: 41.894999999999996
- type: map_at_1
value: 16.663
- type: map_at_10
value: 23.552
- type: map_at_100
value: 24.538
- type: map_at_1000
value: 24.661
- type: map_at_3
value: 21.085
- type: map_at_5
value: 22.391
- type: mrr_at_1
value: 20.025000000000002
- type: mrr_at_10
value: 27.643
- type: mrr_at_100
value: 28.499999999999996
- type: mrr_at_1000
value: 28.582
- type: mrr_at_3
value: 25.083
- type: mrr_at_5
value: 26.544
- type: ndcg_at_1
value: 20.025000000000002
- type: ndcg_at_10
value: 28.272000000000002
- type: ndcg_at_100
value: 33.353
- type: ndcg_at_1000
value: 36.454
- type: ndcg_at_3
value: 23.579
- type: ndcg_at_5
value: 25.685000000000002
- type: precision_at_1
value: 20.025000000000002
- type: precision_at_10
value: 5.187
- type: precision_at_100
value: 0.897
- type: precision_at_1000
value: 0.13
- type: precision_at_3
value: 10.987
- type: precision_at_5
value: 8.06
- type: recall_at_1
value: 16.663
- type: recall_at_10
value: 38.808
- type: recall_at_100
value: 61.305
- type: recall_at_1000
value: 83.571
- type: recall_at_3
value: 25.907999999999998
- type: recall_at_5
value: 31.214
- type: map_at_1
value: 27.695999999999998
- type: map_at_10
value: 37.018
- type: map_at_100
value: 38.263000000000005
- type: map_at_1000
value: 38.371
- type: map_at_3
value: 34.226
- type: map_at_5
value: 35.809999999999995
- type: mrr_at_1
value: 32.916000000000004
- type: mrr_at_10
value: 42.067
- type: mrr_at_100
value: 42.925000000000004
- type: mrr_at_1000
value: 42.978
- type: mrr_at_3
value: 39.637
- type: mrr_at_5
value: 41.134
- type: ndcg_at_1
value: 32.916000000000004
- type: ndcg_at_10
value: 42.539
- type: ndcg_at_100
value: 47.873
- type: ndcg_at_1000
value: 50.08200000000001
- type: ndcg_at_3
value: 37.852999999999994
- type: ndcg_at_5
value: 40.201
- type: precision_at_1
value: 32.916000000000004
- type: precision_at_10
value: 7.5840000000000005
- type: precision_at_100
value: 1.199
- type: precision_at_1000
value: 0.155
- type: precision_at_3
value: 17.485
- type: precision_at_5
value: 12.512
- type: recall_at_1
value: 27.695999999999998
- type: recall_at_10
value: 53.638
- type: recall_at_100
value: 76.116
- type: recall_at_1000
value: 91.069
- type: recall_at_3
value: 41.13
- type: recall_at_5
value: 46.872
- type: map_at_1
value: 24.108
- type: map_at_10
value: 33.372
- type: map_at_100
value: 34.656
- type: map_at_1000
value: 34.768
- type: map_at_3
value: 30.830999999999996
- type: map_at_5
value: 32.204
- type: mrr_at_1
value: 29.110000000000003
- type: mrr_at_10
value: 37.979
- type: mrr_at_100
value: 38.933
- type: mrr_at_1000
value: 38.988
- type: mrr_at_3
value: 35.731
- type: mrr_at_5
value: 36.963
- type: ndcg_at_1
value: 29.110000000000003
- type: ndcg_at_10
value: 38.635000000000005
- type: ndcg_at_100
value: 44.324999999999996
- type: ndcg_at_1000
value: 46.747
- type: ndcg_at_3
value: 34.37
- type: ndcg_at_5
value: 36.228
- type: precision_at_1
value: 29.110000000000003
- type: precision_at_10
value: 6.963
- type: precision_at_100
value: 1.146
- type: precision_at_1000
value: 0.152
- type: precision_at_3
value: 16.400000000000002
- type: precision_at_5
value: 11.552999999999999
- type: recall_at_1
value: 24.108
- type: recall_at_10
value: 49.597
- type: recall_at_100
value: 73.88900000000001
- type: recall_at_1000
value: 90.62400000000001
- type: recall_at_3
value: 37.662
- type: recall_at_5
value: 42.565
- type: map_at_1
value: 25.00791666666667
- type: map_at_10
value: 33.287749999999996
- type: map_at_100
value: 34.41141666666667
- type: map_at_1000
value: 34.52583333333333
- type: map_at_3
value: 30.734416666666668
- type: map_at_5
value: 32.137166666666666
- type: mrr_at_1
value: 29.305666666666664
- type: mrr_at_10
value: 37.22966666666666
- type: mrr_at_100
value: 38.066583333333334
- type: mrr_at_1000
value: 38.12616666666667
- type: mrr_at_3
value: 34.92275
- type: mrr_at_5
value: 36.23333333333334
- type: ndcg_at_1
value: 29.305666666666664
- type: ndcg_at_10
value: 38.25533333333333
- type: ndcg_at_100
value: 43.25266666666666
- type: ndcg_at_1000
value: 45.63583333333334
- type: ndcg_at_3
value: 33.777166666666666
- type: ndcg_at_5
value: 35.85
- type: precision_at_1
value: 29.305666666666664
- type: precision_at_10
value: 6.596416666666667
- type: precision_at_100
value: 1.0784166666666668
- type: precision_at_1000
value: 0.14666666666666664
- type: precision_at_3
value: 15.31075
- type: precision_at_5
value: 10.830916666666667
- type: recall_at_1
value: 25.00791666666667
- type: recall_at_10
value: 49.10933333333333
- type: recall_at_100
value: 71.09216666666667
- type: recall_at_1000
value: 87.77725000000001
- type: recall_at_3
value: 36.660916666666665
- type: recall_at_5
value: 41.94149999999999
- type: map_at_1
value: 23.521
- type: map_at_10
value: 30.043
- type: map_at_100
value: 30.936000000000003
- type: map_at_1000
value: 31.022
- type: map_at_3
value: 27.926000000000002
- type: map_at_5
value: 29.076999999999998
- type: mrr_at_1
value: 26.227
- type: mrr_at_10
value: 32.822
- type: mrr_at_100
value: 33.61
- type: mrr_at_1000
value: 33.672000000000004
- type: mrr_at_3
value: 30.776999999999997
- type: mrr_at_5
value: 31.866
- type: ndcg_at_1
value: 26.227
- type: ndcg_at_10
value: 34.041
- type: ndcg_at_100
value: 38.394
- type: ndcg_at_1000
value: 40.732
- type: ndcg_at_3
value: 30.037999999999997
- type: ndcg_at_5
value: 31.845000000000002
- type: precision_at_1
value: 26.227
- type: precision_at_10
value: 5.244999999999999
- type: precision_at_100
value: 0.808
- type: precision_at_1000
value: 0.107
- type: precision_at_3
value: 12.679000000000002
- type: precision_at_5
value: 8.773
- type: recall_at_1
value: 23.521
- type: recall_at_10
value: 43.633
- type: recall_at_100
value: 63.126000000000005
- type: recall_at_1000
value: 80.765
- type: recall_at_3
value: 32.614
- type: recall_at_5
value: 37.15
- type: map_at_1
value: 16.236
- type: map_at_10
value: 22.898
- type: map_at_100
value: 23.878
- type: map_at_1000
value: 24.009
- type: map_at_3
value: 20.87
- type: map_at_5
value: 22.025
- type: mrr_at_1
value: 19.339000000000002
- type: mrr_at_10
value: 26.382
- type: mrr_at_100
value: 27.245
- type: mrr_at_1000
value: 27.33
- type: mrr_at_3
value: 24.386
- type: mrr_at_5
value: 25.496000000000002
- type: ndcg_at_1
value: 19.339000000000002
- type: ndcg_at_10
value: 27.139999999999997
- type: ndcg_at_100
value: 31.944
- type: ndcg_at_1000
value: 35.077999999999996
- type: ndcg_at_3
value: 23.424
- type: ndcg_at_5
value: 25.188
- type: precision_at_1
value: 19.339000000000002
- type: precision_at_10
value: 4.8309999999999995
- type: precision_at_100
value: 0.845
- type: precision_at_1000
value: 0.128
- type: precision_at_3
value: 10.874
- type: precision_at_5
value: 7.825
- type: recall_at_1
value: 16.236
- type: recall_at_10
value: 36.513
- type: recall_at_100
value: 57.999
- type: recall_at_1000
value: 80.512
- type: recall_at_3
value: 26.179999999999996
- type: recall_at_5
value: 30.712
- type: map_at_1
value: 24.11
- type: map_at_10
value: 31.566
- type: map_at_100
value: 32.647
- type: map_at_1000
value: 32.753
- type: map_at_3
value: 29.24
- type: map_at_5
value: 30.564999999999998
- type: mrr_at_1
value: 28.265
- type: mrr_at_10
value: 35.504000000000005
- type: mrr_at_100
value: 36.436
- type: mrr_at_1000
value: 36.503
- type: mrr_at_3
value: 33.349000000000004
- type: mrr_at_5
value: 34.622
- type: ndcg_at_1
value: 28.265
- type: ndcg_at_10
value: 36.192
- type: ndcg_at_100
value: 41.388000000000005
- type: ndcg_at_1000
value: 43.948
- type: ndcg_at_3
value: 31.959
- type: ndcg_at_5
value: 33.998
- type: precision_at_1
value: 28.265
- type: precision_at_10
value: 5.989
- type: precision_at_100
value: 0.9650000000000001
- type: precision_at_1000
value: 0.13
- type: precision_at_3
value: 14.335
- type: precision_at_5
value: 10.112
- type: recall_at_1
value: 24.11
- type: recall_at_10
value: 46.418
- type: recall_at_100
value: 69.314
- type: recall_at_1000
value: 87.397
- type: recall_at_3
value: 34.724
- type: recall_at_5
value: 39.925
- type: map_at_1
value: 22.091
- type: map_at_10
value: 29.948999999999998
- type: map_at_100
value: 31.502000000000002
- type: map_at_1000
value: 31.713
- type: map_at_3
value: 27.464
- type: map_at_5
value: 28.968
- type: mrr_at_1
value: 26.482
- type: mrr_at_10
value: 34.009
- type: mrr_at_100
value: 35.081
- type: mrr_at_1000
value: 35.138000000000005
- type: mrr_at_3
value: 31.785000000000004
- type: mrr_at_5
value: 33.178999999999995
- type: ndcg_at_1
value: 26.482
- type: ndcg_at_10
value: 35.008
- type: ndcg_at_100
value: 41.272999999999996
- type: ndcg_at_1000
value: 43.972
- type: ndcg_at_3
value: 30.804
- type: ndcg_at_5
value: 33.046
- type: precision_at_1
value: 26.482
- type: precision_at_10
value: 6.462
- type: precision_at_100
value: 1.431
- type: precision_at_1000
value: 0.22899999999999998
- type: precision_at_3
value: 14.360999999999999
- type: precision_at_5
value: 10.474
- type: recall_at_1
value: 22.091
- type: recall_at_10
value: 45.125
- type: recall_at_100
value: 72.313
- type: recall_at_1000
value: 89.503
- type: recall_at_3
value: 33.158
- type: recall_at_5
value: 39.086999999999996
- type: map_at_1
value: 19.883
- type: map_at_10
value: 26.951000000000004
- type: map_at_100
value: 27.927999999999997
- type: map_at_1000
value: 28.022000000000002
- type: map_at_3
value: 24.616
- type: map_at_5
value: 25.917
- type: mrr_at_1
value: 21.996
- type: mrr_at_10
value: 29.221000000000004
- type: mrr_at_100
value: 30.024
- type: mrr_at_1000
value: 30.095
- type: mrr_at_3
value: 26.833000000000002
- type: mrr_at_5
value: 28.155
- type: ndcg_at_1
value: 21.996
- type: ndcg_at_10
value: 31.421
- type: ndcg_at_100
value: 36.237
- type: ndcg_at_1000
value: 38.744
- type: ndcg_at_3
value: 26.671
- type: ndcg_at_5
value: 28.907
- type: precision_at_1
value: 21.996
- type: precision_at_10
value: 5.009
- type: precision_at_100
value: 0.799
- type: precision_at_1000
value: 0.11199999999999999
- type: precision_at_3
value: 11.275
- type: precision_at_5
value: 8.059
- type: recall_at_1
value: 19.883
- type: recall_at_10
value: 43.132999999999996
- type: recall_at_100
value: 65.654
- type: recall_at_1000
value: 84.492
- type: recall_at_3
value: 30.209000000000003
- type: recall_at_5
value: 35.616
- task:
type: Retrieval
dataset:
name: MTEB ClimateFEVER
type: climate-fever
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 17.756
- type: map_at_10
value: 30.378
- type: map_at_100
value: 32.537
- type: map_at_1000
value: 32.717
- type: map_at_3
value: 25.599
- type: map_at_5
value: 28.372999999999998
- type: mrr_at_1
value: 41.303
- type: mrr_at_10
value: 53.483999999999995
- type: mrr_at_100
value: 54.106
- type: mrr_at_1000
value: 54.127
- type: mrr_at_3
value: 50.315
- type: mrr_at_5
value: 52.396
- type: ndcg_at_1
value: 41.303
- type: ndcg_at_10
value: 40.503
- type: ndcg_at_100
value: 47.821000000000005
- type: ndcg_at_1000
value: 50.788
- type: ndcg_at_3
value: 34.364
- type: ndcg_at_5
value: 36.818
- type: precision_at_1
value: 41.303
- type: precision_at_10
value: 12.463000000000001
- type: precision_at_100
value: 2.037
- type: precision_at_1000
value: 0.26
- type: precision_at_3
value: 25.798
- type: precision_at_5
value: 19.896
- type: recall_at_1
value: 17.756
- type: recall_at_10
value: 46.102
- type: recall_at_100
value: 70.819
- type: recall_at_1000
value: 87.21799999999999
- type: recall_at_3
value: 30.646
- type: recall_at_5
value: 38.022
- task:
type: Retrieval
dataset:
name: MTEB DBPedia
type: dbpedia-entity
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 9.033
- type: map_at_10
value: 20.584
- type: map_at_100
value: 29.518
- type: map_at_1000
value: 31.186000000000003
- type: map_at_3
value: 14.468
- type: map_at_5
value: 17.177
- type: mrr_at_1
value: 69.75
- type: mrr_at_10
value: 77.025
- type: mrr_at_100
value: 77.36699999999999
- type: mrr_at_1000
value: 77.373
- type: mrr_at_3
value: 75.583
- type: mrr_at_5
value: 76.396
- type: ndcg_at_1
value: 58.5
- type: ndcg_at_10
value: 45.033
- type: ndcg_at_100
value: 49.071
- type: ndcg_at_1000
value: 56.056
- type: ndcg_at_3
value: 49.936
- type: ndcg_at_5
value: 47.471999999999994
- type: precision_at_1
value: 69.75
- type: precision_at_10
value: 35.775
- type: precision_at_100
value: 11.594999999999999
- type: precision_at_1000
value: 2.062
- type: precision_at_3
value: 52.5
- type: precision_at_5
value: 45.300000000000004
- type: recall_at_1
value: 9.033
- type: recall_at_10
value: 26.596999999999998
- type: recall_at_100
value: 54.607000000000006
- type: recall_at_1000
value: 76.961
- type: recall_at_3
value: 15.754999999999999
- type: recall_at_5
value: 20.033
- task:
type: Classification
dataset:
name: MTEB EmotionClassification
type: mteb/emotion
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 48.345000000000006
- type: f1
value: 43.4514918068706
- task:
type: Retrieval
dataset:
name: MTEB FEVER
type: fever
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 71.29100000000001
- type: map_at_10
value: 81.059
- type: map_at_100
value: 81.341
- type: map_at_1000
value: 81.355
- type: map_at_3
value: 79.74799999999999
- type: map_at_5
value: 80.612
- type: mrr_at_1
value: 76.40299999999999
- type: mrr_at_10
value: 84.615
- type: mrr_at_100
value: 84.745
- type: mrr_at_1000
value: 84.748
- type: mrr_at_3
value: 83.776
- type: mrr_at_5
value: 84.343
- type: ndcg_at_1
value: 76.40299999999999
- type: ndcg_at_10
value: 84.981
- type: ndcg_at_100
value: 86.00999999999999
- type: ndcg_at_1000
value: 86.252
- type: ndcg_at_3
value: 82.97
- type: ndcg_at_5
value: 84.152
- type: precision_at_1
value: 76.40299999999999
- type: precision_at_10
value: 10.446
- type: precision_at_100
value: 1.1199999999999999
- type: precision_at_1000
value: 0.116
- type: precision_at_3
value: 32.147999999999996
- type: precision_at_5
value: 20.135
- type: recall_at_1
value: 71.29100000000001
- type: recall_at_10
value: 93.232
- type: recall_at_100
value: 97.363
- type: recall_at_1000
value: 98.905
- type: recall_at_3
value: 87.893
- type: recall_at_5
value: 90.804
- task:
type: Retrieval
dataset:
name: MTEB FiQA2018
type: fiqa
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 18.667
- type: map_at_10
value: 30.853
- type: map_at_100
value: 32.494
- type: map_at_1000
value: 32.677
- type: map_at_3
value: 26.91
- type: map_at_5
value: 29.099000000000004
- type: mrr_at_1
value: 37.191
- type: mrr_at_10
value: 46.171
- type: mrr_at_100
value: 47.056
- type: mrr_at_1000
value: 47.099000000000004
- type: mrr_at_3
value: 44.059
- type: mrr_at_5
value: 45.147
- type: ndcg_at_1
value: 37.191
- type: ndcg_at_10
value: 38.437
- type: ndcg_at_100
value: 44.62
- type: ndcg_at_1000
value: 47.795
- type: ndcg_at_3
value: 35.003
- type: ndcg_at_5
value: 36.006
- type: precision_at_1
value: 37.191
- type: precision_at_10
value: 10.586
- type: precision_at_100
value: 1.688
- type: precision_at_1000
value: 0.22699999999999998
- type: precision_at_3
value: 23.302
- type: precision_at_5
value: 17.006
- type: recall_at_1
value: 18.667
- type: recall_at_10
value: 45.367000000000004
- type: recall_at_100
value: 68.207
- type: recall_at_1000
value: 87.072
- type: recall_at_3
value: 32.129000000000005
- type: recall_at_5
value: 37.719
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA
type: hotpotqa
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 39.494
- type: map_at_10
value: 66.223
- type: map_at_100
value: 67.062
- type: map_at_1000
value: 67.11500000000001
- type: map_at_3
value: 62.867
- type: map_at_5
value: 64.994
- type: mrr_at_1
value: 78.987
- type: mrr_at_10
value: 84.585
- type: mrr_at_100
value: 84.773
- type: mrr_at_1000
value: 84.77900000000001
- type: mrr_at_3
value: 83.592
- type: mrr_at_5
value: 84.235
- type: ndcg_at_1
value: 78.987
- type: ndcg_at_10
value: 73.64
- type: ndcg_at_100
value: 76.519
- type: ndcg_at_1000
value: 77.51
- type: ndcg_at_3
value: 68.893
- type: ndcg_at_5
value: 71.585
- type: precision_at_1
value: 78.987
- type: precision_at_10
value: 15.529000000000002
- type: precision_at_100
value: 1.7770000000000001
- type: precision_at_1000
value: 0.191
- type: precision_at_3
value: 44.808
- type: precision_at_5
value: 29.006999999999998
- type: recall_at_1
value: 39.494
- type: recall_at_10
value: 77.643
- type: recall_at_100
value: 88.825
- type: recall_at_1000
value: 95.321
- type: recall_at_3
value: 67.211
- type: recall_at_5
value: 72.519
- task:
type: Classification
dataset:
name: MTEB ImdbClassification
type: mteb/imdb
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 85.55959999999999
- type: ap
value: 80.7246500384617
- type: f1
value: 85.52336485065454
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO
type: msmarco
config: default
split: dev
revision: None
metrics:
- type: map_at_1
value: 23.631
- type: map_at_10
value: 36.264
- type: map_at_100
value: 37.428
- type: map_at_1000
value: 37.472
- type: map_at_3
value: 32.537
- type: map_at_5
value: 34.746
- type: mrr_at_1
value: 24.312
- type: mrr_at_10
value: 36.858000000000004
- type: mrr_at_100
value: 37.966
- type: mrr_at_1000
value: 38.004
- type: mrr_at_3
value: 33.188
- type: mrr_at_5
value: 35.367
- type: ndcg_at_1
value: 24.312
- type: ndcg_at_10
value: 43.126999999999995
- type: ndcg_at_100
value: 48.642
- type: ndcg_at_1000
value: 49.741
- type: ndcg_at_3
value: 35.589
- type: ndcg_at_5
value: 39.515
- type: precision_at_1
value: 24.312
- type: precision_at_10
value: 6.699
- type: precision_at_100
value: 0.9450000000000001
- type: precision_at_1000
value: 0.104
- type: precision_at_3
value: 15.153
- type: precision_at_5
value: 11.065999999999999
- type: recall_at_1
value: 23.631
- type: recall_at_10
value: 64.145
- type: recall_at_100
value: 89.41
- type: recall_at_1000
value: 97.83500000000001
- type: recall_at_3
value: 43.769000000000005
- type: recall_at_5
value: 53.169
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (en)
type: mteb/mtop_domain
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 93.4108527131783
- type: f1
value: 93.1415880261038
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (en)
type: mteb/mtop_intent
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 77.24806201550388
- type: f1
value: 60.531916308197175
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (en)
type: mteb/amazon_massive_intent
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 73.71553463349024
- type: f1
value: 71.70753174900791
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (en)
type: mteb/amazon_massive_scenario
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 77.79757901815736
- type: f1
value: 77.83719850433258
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringP2P
type: mteb/medrxiv-clustering-p2p
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 33.74193296622113
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringS2S
type: mteb/medrxiv-clustering-s2s
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 30.64257594108566
- task:
type: Reranking
dataset:
name: MTEB MindSmallReranking
type: mteb/mind_small
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 30.811018518883625
- type: mrr
value: 31.910376577445003
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus
type: nfcorpus
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 5.409
- type: map_at_10
value: 13.093
- type: map_at_100
value: 16.256999999999998
- type: map_at_1000
value: 17.617
- type: map_at_3
value: 9.555
- type: map_at_5
value: 11.428
- type: mrr_at_1
value: 45.201
- type: mrr_at_10
value: 54.179
- type: mrr_at_100
value: 54.812000000000005
- type: mrr_at_1000
value: 54.840999999999994
- type: mrr_at_3
value: 51.909000000000006
- type: mrr_at_5
value: 53.519000000000005
- type: ndcg_at_1
value: 43.189
- type: ndcg_at_10
value: 35.028
- type: ndcg_at_100
value: 31.226
- type: ndcg_at_1000
value: 39.678000000000004
- type: ndcg_at_3
value: 40.596
- type: ndcg_at_5
value: 38.75
- type: precision_at_1
value: 44.582
- type: precision_at_10
value: 25.974999999999998
- type: precision_at_100
value: 7.793
- type: precision_at_1000
value: 2.036
- type: precision_at_3
value: 38.493
- type: precision_at_5
value: 33.994
- type: recall_at_1
value: 5.409
- type: recall_at_10
value: 16.875999999999998
- type: recall_at_100
value: 30.316
- type: recall_at_1000
value: 60.891
- type: recall_at_3
value: 10.688
- type: recall_at_5
value: 13.832
- task:
type: Retrieval
dataset:
name: MTEB NQ
type: nq
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 36.375
- type: map_at_10
value: 51.991
- type: map_at_100
value: 52.91400000000001
- type: map_at_1000
value: 52.93600000000001
- type: map_at_3
value: 48.014
- type: map_at_5
value: 50.381
- type: mrr_at_1
value: 40.759
- type: mrr_at_10
value: 54.617000000000004
- type: mrr_at_100
value: 55.301
- type: mrr_at_1000
value: 55.315000000000005
- type: mrr_at_3
value: 51.516
- type: mrr_at_5
value: 53.435
- type: ndcg_at_1
value: 40.759
- type: ndcg_at_10
value: 59.384
- type: ndcg_at_100
value: 63.157
- type: ndcg_at_1000
value: 63.654999999999994
- type: ndcg_at_3
value: 52.114000000000004
- type: ndcg_at_5
value: 55.986000000000004
- type: precision_at_1
value: 40.759
- type: precision_at_10
value: 9.411999999999999
- type: precision_at_100
value: 1.153
- type: precision_at_1000
value: 0.12
- type: precision_at_3
value: 23.329
- type: precision_at_5
value: 16.256999999999998
- type: recall_at_1
value: 36.375
- type: recall_at_10
value: 79.053
- type: recall_at_100
value: 95.167
- type: recall_at_1000
value: 98.82
- type: recall_at_3
value: 60.475
- type: recall_at_5
value: 69.327
- task:
type: Retrieval
dataset:
name: MTEB QuoraRetrieval
type: quora
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 70.256
- type: map_at_10
value: 83.8
- type: map_at_100
value: 84.425
- type: map_at_1000
value: 84.444
- type: map_at_3
value: 80.906
- type: map_at_5
value: 82.717
- type: mrr_at_1
value: 80.97999999999999
- type: mrr_at_10
value: 87.161
- type: mrr_at_100
value: 87.262
- type: mrr_at_1000
value: 87.263
- type: mrr_at_3
value: 86.175
- type: mrr_at_5
value: 86.848
- type: ndcg_at_1
value: 80.97999999999999
- type: ndcg_at_10
value: 87.697
- type: ndcg_at_100
value: 88.959
- type: ndcg_at_1000
value: 89.09899999999999
- type: ndcg_at_3
value: 84.83800000000001
- type: ndcg_at_5
value: 86.401
- type: precision_at_1
value: 80.97999999999999
- type: precision_at_10
value: 13.261000000000001
- type: precision_at_100
value: 1.5150000000000001
- type: precision_at_1000
value: 0.156
- type: precision_at_3
value: 37.01
- type: precision_at_5
value: 24.298000000000002
- type: recall_at_1
value: 70.256
- type: recall_at_10
value: 94.935
- type: recall_at_100
value: 99.274
- type: recall_at_1000
value: 99.928
- type: recall_at_3
value: 86.602
- type: recall_at_5
value: 91.133
- task:
type: Clustering
dataset:
name: MTEB RedditClustering
type: mteb/reddit-clustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 56.322692497613104
- task:
type: Clustering
dataset:
name: MTEB RedditClusteringP2P
type: mteb/reddit-clustering-p2p
config: default
split: test
revision: 282350215ef01743dc01b456c7f5241fa8937f16
metrics:
- type: v_measure
value: 61.895813503775074
- task:
type: Retrieval
dataset:
name: MTEB SCIDOCS
type: scidocs
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 4.338
- type: map_at_10
value: 10.767
- type: map_at_100
value: 12.537999999999998
- type: map_at_1000
value: 12.803999999999998
- type: map_at_3
value: 7.788
- type: map_at_5
value: 9.302000000000001
- type: mrr_at_1
value: 21.4
- type: mrr_at_10
value: 31.637999999999998
- type: mrr_at_100
value: 32.688
- type: mrr_at_1000
value: 32.756
- type: mrr_at_3
value: 28.433000000000003
- type: mrr_at_5
value: 30.178
- type: ndcg_at_1
value: 21.4
- type: ndcg_at_10
value: 18.293
- type: ndcg_at_100
value: 25.274
- type: ndcg_at_1000
value: 30.284
- type: ndcg_at_3
value: 17.391000000000002
- type: ndcg_at_5
value: 15.146999999999998
- type: precision_at_1
value: 21.4
- type: precision_at_10
value: 9.48
- type: precision_at_100
value: 1.949
- type: precision_at_1000
value: 0.316
- type: precision_at_3
value: 16.167
- type: precision_at_5
value: 13.22
- type: recall_at_1
value: 4.338
- type: recall_at_10
value: 19.213
- type: recall_at_100
value: 39.562999999999995
- type: recall_at_1000
value: 64.08
- type: recall_at_3
value: 9.828000000000001
- type: recall_at_5
value: 13.383000000000001
- task:
type: STS
dataset:
name: MTEB SICK-R
type: mteb/sickr-sts
config: default
split: test
revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
metrics:
- type: cos_sim_pearson
value: 82.42568163642142
- type: cos_sim_spearman
value: 78.5797159641342
- type: euclidean_pearson
value: 80.22151260811604
- type: euclidean_spearman
value: 78.5797151953878
- type: manhattan_pearson
value: 80.21224215864788
- type: manhattan_spearman
value: 78.55641478381344
- task:
type: STS
dataset:
name: MTEB STS12
type: mteb/sts12-sts
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_pearson
value: 85.44020710812569
- type: cos_sim_spearman
value: 78.91631735081286
- type: euclidean_pearson
value: 81.64188964182102
- type: euclidean_spearman
value: 78.91633286881678
- type: manhattan_pearson
value: 81.69294748512496
- type: manhattan_spearman
value: 78.93438558002656
- task:
type: STS
dataset:
name: MTEB STS13
type: mteb/sts13-sts
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_pearson
value: 84.27165426412311
- type: cos_sim_spearman
value: 85.40429140249618
- type: euclidean_pearson
value: 84.7509580724893
- type: euclidean_spearman
value: 85.40429140249618
- type: manhattan_pearson
value: 84.76488289321308
- type: manhattan_spearman
value: 85.4256793698708
- task:
type: STS
dataset:
name: MTEB STS14
type: mteb/sts14-sts
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_pearson
value: 83.138851760732
- type: cos_sim_spearman
value: 81.64101363896586
- type: euclidean_pearson
value: 82.55165038934942
- type: euclidean_spearman
value: 81.64105257080502
- type: manhattan_pearson
value: 82.52802949883335
- type: manhattan_spearman
value: 81.61255430718158
- task:
type: STS
dataset:
name: MTEB STS15
type: mteb/sts15-sts
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_pearson
value: 86.0654695484029
- type: cos_sim_spearman
value: 87.20408521902229
- type: euclidean_pearson
value: 86.8110651362115
- type: euclidean_spearman
value: 87.20408521902229
- type: manhattan_pearson
value: 86.77984656478691
- type: manhattan_spearman
value: 87.1719947099227
- task:
type: STS
dataset:
name: MTEB STS16
type: mteb/sts16-sts
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_pearson
value: 83.77823915496512
- type: cos_sim_spearman
value: 85.43566325729779
- type: euclidean_pearson
value: 84.5396956658821
- type: euclidean_spearman
value: 85.43566325729779
- type: manhattan_pearson
value: 84.5665398848169
- type: manhattan_spearman
value: 85.44375870303232
- task:
type: STS
dataset:
name: MTEB STS17 (en-en)
type: mteb/sts17-crosslingual-sts
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 87.20030208471798
- type: cos_sim_spearman
value: 87.20485505076539
- type: euclidean_pearson
value: 88.10588324368722
- type: euclidean_spearman
value: 87.20485505076539
- type: manhattan_pearson
value: 87.92324770415183
- type: manhattan_spearman
value: 87.0571314561877
- task:
type: STS
dataset:
name: MTEB STS22 (en)
type: mteb/sts22-crosslingual-sts
config: en
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 63.06093161604453
- type: cos_sim_spearman
value: 64.2163140357722
- type: euclidean_pearson
value: 65.27589680994006
- type: euclidean_spearman
value: 64.2163140357722
- type: manhattan_pearson
value: 65.45904383711101
- type: manhattan_spearman
value: 64.55404716679305
- task:
type: STS
dataset:
name: MTEB STSBenchmark
type: mteb/stsbenchmark-sts
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_pearson
value: 84.32976164578706
- type: cos_sim_spearman
value: 85.54302197678368
- type: euclidean_pearson
value: 85.26307149193056
- type: euclidean_spearman
value: 85.54302197678368
- type: manhattan_pearson
value: 85.26647282029371
- type: manhattan_spearman
value: 85.5316135265568
- task:
type: Reranking
dataset:
name: MTEB SciDocsRR
type: mteb/scidocs-reranking
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 81.44675968318754
- type: mrr
value: 94.92741826075158
- task:
type: Retrieval
dataset:
name: MTEB SciFact
type: scifact
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 56.34400000000001
- type: map_at_10
value: 65.927
- type: map_at_100
value: 66.431
- type: map_at_1000
value: 66.461
- type: map_at_3
value: 63.529
- type: map_at_5
value: 64.818
- type: mrr_at_1
value: 59.333000000000006
- type: mrr_at_10
value: 67.54599999999999
- type: mrr_at_100
value: 67.892
- type: mrr_at_1000
value: 67.917
- type: mrr_at_3
value: 65.778
- type: mrr_at_5
value: 66.794
- type: ndcg_at_1
value: 59.333000000000006
- type: ndcg_at_10
value: 70.5
- type: ndcg_at_100
value: 72.688
- type: ndcg_at_1000
value: 73.483
- type: ndcg_at_3
value: 66.338
- type: ndcg_at_5
value: 68.265
- type: precision_at_1
value: 59.333000000000006
- type: precision_at_10
value: 9.3
- type: precision_at_100
value: 1.053
- type: precision_at_1000
value: 0.11199999999999999
- type: precision_at_3
value: 25.889
- type: precision_at_5
value: 16.866999999999997
- type: recall_at_1
value: 56.34400000000001
- type: recall_at_10
value: 82.789
- type: recall_at_100
value: 92.767
- type: recall_at_1000
value: 99
- type: recall_at_3
value: 71.64399999999999
- type: recall_at_5
value: 76.322
- task:
type: PairClassification
dataset:
name: MTEB SprintDuplicateQuestions
type: mteb/sprintduplicatequestions-pairclassification
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.75742574257426
- type: cos_sim_ap
value: 93.52081548447406
- type: cos_sim_f1
value: 87.33850129198966
- type: cos_sim_precision
value: 90.37433155080214
- type: cos_sim_recall
value: 84.5
- type: dot_accuracy
value: 99.75742574257426
- type: dot_ap
value: 93.52081548447406
- type: dot_f1
value: 87.33850129198966
- type: dot_precision
value: 90.37433155080214
- type: dot_recall
value: 84.5
- type: euclidean_accuracy
value: 99.75742574257426
- type: euclidean_ap
value: 93.52081548447406
- type: euclidean_f1
value: 87.33850129198966
- type: euclidean_precision
value: 90.37433155080214
- type: euclidean_recall
value: 84.5
- type: manhattan_accuracy
value: 99.75841584158415
- type: manhattan_ap
value: 93.4975678585854
- type: manhattan_f1
value: 87.26708074534162
- type: manhattan_precision
value: 90.45064377682404
- type: manhattan_recall
value: 84.3
- type: max_accuracy
value: 99.75841584158415
- type: max_ap
value: 93.52081548447406
- type: max_f1
value: 87.33850129198966
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClustering
type: mteb/stackexchange-clustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 64.31437036686651
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClusteringP2P
type: mteb/stackexchange-clustering-p2p
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 33.25569319007206
- task:
type: Reranking
dataset:
name: MTEB StackOverflowDupQuestions
type: mteb/stackoverflowdupquestions-reranking
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 49.90474939720706
- type: mrr
value: 50.568115503777264
- task:
type: Summarization
dataset:
name: MTEB SummEval
type: mteb/summeval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 29.866828641244712
- type: cos_sim_spearman
value: 30.077555055873866
- type: dot_pearson
value: 29.866832988572266
- type: dot_spearman
value: 30.077555055873866
- task:
type: Retrieval
dataset:
name: MTEB TRECCOVID
type: trec-covid
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 0.232
- type: map_at_10
value: 2.094
- type: map_at_100
value: 11.971
- type: map_at_1000
value: 28.158
- type: map_at_3
value: 0.688
- type: map_at_5
value: 1.114
- type: mrr_at_1
value: 88
- type: mrr_at_10
value: 93.4
- type: mrr_at_100
value: 93.4
- type: mrr_at_1000
value: 93.4
- type: mrr_at_3
value: 93
- type: mrr_at_5
value: 93.4
- type: ndcg_at_1
value: 84
- type: ndcg_at_10
value: 79.923
- type: ndcg_at_100
value: 61.17
- type: ndcg_at_1000
value: 53.03
- type: ndcg_at_3
value: 84.592
- type: ndcg_at_5
value: 82.821
- type: precision_at_1
value: 88
- type: precision_at_10
value: 85
- type: precision_at_100
value: 63.019999999999996
- type: precision_at_1000
value: 23.554
- type: precision_at_3
value: 89.333
- type: precision_at_5
value: 87.2
- type: recall_at_1
value: 0.232
- type: recall_at_10
value: 2.255
- type: recall_at_100
value: 14.823
- type: recall_at_1000
value: 49.456
- type: recall_at_3
value: 0.718
- type: recall_at_5
value: 1.175
- task:
type: Retrieval
dataset:
name: MTEB Touche2020
type: webis-touche2020
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 2.547
- type: map_at_10
value: 11.375
- type: map_at_100
value: 18.194
- type: map_at_1000
value: 19.749
- type: map_at_3
value: 5.825
- type: map_at_5
value: 8.581
- type: mrr_at_1
value: 32.653
- type: mrr_at_10
value: 51.32
- type: mrr_at_100
value: 51.747
- type: mrr_at_1000
value: 51.747
- type: mrr_at_3
value: 47.278999999999996
- type: mrr_at_5
value: 48.605
- type: ndcg_at_1
value: 29.592000000000002
- type: ndcg_at_10
value: 28.151
- type: ndcg_at_100
value: 39.438
- type: ndcg_at_1000
value: 50.769
- type: ndcg_at_3
value: 30.758999999999997
- type: ndcg_at_5
value: 30.366
- type: precision_at_1
value: 32.653
- type: precision_at_10
value: 25.714
- type: precision_at_100
value: 8.041
- type: precision_at_1000
value: 1.555
- type: precision_at_3
value: 33.333
- type: precision_at_5
value: 31.837
- type: recall_at_1
value: 2.547
- type: recall_at_10
value: 18.19
- type: recall_at_100
value: 49.538
- type: recall_at_1000
value: 83.86
- type: recall_at_3
value: 7.329
- type: recall_at_5
value: 11.532
- task:
type: Classification
dataset:
name: MTEB ToxicConversationsClassification
type: mteb/toxic_conversations_50k
config: default
split: test
revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
metrics:
- type: accuracy
value: 71.4952
- type: ap
value: 14.793362635531409
- type: f1
value: 55.204635551516915
- task:
type: Classification
dataset:
name: MTEB TweetSentimentExtractionClassification
type: mteb/tweet_sentiment_extraction
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 61.5365025466893
- type: f1
value: 61.81742556334845
- task:
type: Clustering
dataset:
name: MTEB TwentyNewsgroupsClustering
type: mteb/twentynewsgroups-clustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 49.05531070301185
- task:
type: PairClassification
dataset:
name: MTEB TwitterSemEval2015
type: mteb/twittersemeval2015-pairclassification
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 86.51725576682364
- type: cos_sim_ap
value: 75.2292304265163
- type: cos_sim_f1
value: 69.54022988505749
- type: cos_sim_precision
value: 63.65629110039457
- type: cos_sim_recall
value: 76.62269129287598
- type: dot_accuracy
value: 86.51725576682364
- type: dot_ap
value: 75.22922386081054
- type: dot_f1
value: 69.54022988505749
- type: dot_precision
value: 63.65629110039457
- type: dot_recall
value: 76.62269129287598
- type: euclidean_accuracy
value: 86.51725576682364
- type: euclidean_ap
value: 75.22925730473472
- type: euclidean_f1
value: 69.54022988505749
- type: euclidean_precision
value: 63.65629110039457
- type: euclidean_recall
value: 76.62269129287598
- type: manhattan_accuracy
value: 86.52321630804077
- type: manhattan_ap
value: 75.20608115037336
- type: manhattan_f1
value: 69.60000000000001
- type: manhattan_precision
value: 64.37219730941705
- type: manhattan_recall
value: 75.75197889182058
- type: max_accuracy
value: 86.52321630804077
- type: max_ap
value: 75.22925730473472
- type: max_f1
value: 69.60000000000001
- task:
type: PairClassification
dataset:
name: MTEB TwitterURLCorpus
type: mteb/twitterurlcorpus-pairclassification
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 89.34877944657896
- type: cos_sim_ap
value: 86.71257569277373
- type: cos_sim_f1
value: 79.10386355986088
- type: cos_sim_precision
value: 76.91468470434214
- type: cos_sim_recall
value: 81.4213119802895
- type: dot_accuracy
value: 89.34877944657896
- type: dot_ap
value: 86.71257133133368
- type: dot_f1
value: 79.10386355986088
- type: dot_precision
value: 76.91468470434214
- type: dot_recall
value: 81.4213119802895
- type: euclidean_accuracy
value: 89.34877944657896
- type: euclidean_ap
value: 86.71257651501476
- type: euclidean_f1
value: 79.10386355986088
- type: euclidean_precision
value: 76.91468470434214
- type: euclidean_recall
value: 81.4213119802895
- type: manhattan_accuracy
value: 89.35848177901967
- type: manhattan_ap
value: 86.69330615469126
- type: manhattan_f1
value: 79.13867741453949
- type: manhattan_precision
value: 76.78881807647741
- type: manhattan_recall
value: 81.63689559593472
- type: max_accuracy
value: 89.35848177901967
- type: max_ap
value: 86.71257651501476
- type: max_f1
value: 79.13867741453949
---
# nomic-embed-text-v1: A Reproducible Long Context (8192) Text Embedder
`nomic-embed-text-v1` is an 8192-context-length text encoder that surpasses OpenAI `text-embedding-ada-002` and `text-embedding-3-small` on both short- and long-context tasks.
| Name | SeqLen | MTEB | LoCo | Jina Long Context | Open Weights | Open Training Code | Open Data |
| :-------------------------------:| :----- | :-------- | :------: | :---------------: | :-----------: | :----------------: | :---------- |
| nomic-embed-text-v1 | 8192 | **62.39** |**85.53** | 54.16 | ✅ | ✅ | ✅ |
| jina-embeddings-v2-base-en | 8192 | 60.39 | 85.45 | 51.90 | ✅ | ❌ | ❌ |
| text-embedding-3-small | 8191 | 62.26 | 82.40 | **58.20** | ❌ | ❌ | ❌ |
| text-embedding-ada-002 | 8191 | 60.99 | 52.7 | 55.25 | ❌ | ❌ | ❌ |
## Hosted Inference API
The easiest way to get started with Nomic Embed is through the Nomic Embedding API.
Generating embeddings with the `nomic` Python client is as easy as:
```python
from nomic import embed
output = embed.text(
texts=['Nomic Embedding API', '#keepAIOpen'],
model='nomic-embed-text-v1',
task_type='search_document'
)
print(output)
```
For more information, see the [API reference](https://docs.nomic.ai/reference/endpoints/nomic-embed-text).
## Data Visualization
Click the Nomic Atlas map below to visualize a 5M sample of our contrastive pretraining data!
[](https://atlas.nomic.ai/map/nomic-text-embed-v1-5m-sample)
## Training Details
We train our embedder using a multi-stage training pipeline. Starting from a long-context [BERT model](https://huggingface.co/nomic-ai/nomic-bert-2048),
the first unsupervised contrastive stage trains on a dataset generated from weakly related text pairs, such as question-answer pairs from forums like StackExchange and Quora, title-body pairs from Amazon reviews, and summarizations from news articles.
In the second finetuning stage, higher-quality labeled datasets, such as search queries and answers from web searches, are leveraged. Data curation and hard-example mining are crucial in this stage.
For more details, see the Nomic Embed [Technical Report](https://static.nomic.ai/reports/2024_Nomic_Embed_Text_Technical_Report.pdf) and corresponding [blog post](https://blog.nomic.ai/posts/nomic-embed-text-v1).
The training data is released in its entirety. For more details, see the `contrastors` [repository](https://github.com/nomic-ai/contrastors).
## Usage
Note that `nomic-embed-text` requires task prefixes! We support the prefixes `search_query`, `search_document`, `classification`, and `clustering`.
For retrieval applications, prepend `search_document: ` to all of your documents and `search_query: ` to your queries, as in the retrieval sketch after the Sentence Transformers example below.
### Sentence Transformers
```python
from sentence_transformers import SentenceTransformer
model = SentenceTransformer("nomic-ai/nomic-embed-text-v1", trust_remote_code=True)
sentences = ['search_query: What is TSNE?', 'search_query: Who is Laurens van der Maaten?']
embeddings = model.encode(sentences)
print(embeddings)
```
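For retrieval, a minimal sketch along these lines may help; the example documents and the cosine-similarity ranking step are illustrative assumptions, not part of the model card's API:

```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("nomic-ai/nomic-embed-text-v1", trust_remote_code=True)

# Documents get the search_document prefix; queries get search_query.
docs = [
    "search_document: TSNE is a dimensionality reduction technique.",
    "search_document: The Nomic Embedding API generates text embeddings.",
]
query = "search_query: What is TSNE?"

doc_emb = model.encode(docs)     # shape: (num_docs, dim)
query_emb = model.encode(query)  # shape: (dim,)

# Rank documents by cosine similarity to the query.
scores = util.cos_sim(query_emb, doc_emb)[0]
for doc, score in sorted(zip(docs, scores), key=lambda x: -float(x[1])):
    print(f"{float(score):.3f}  {doc}")
```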
### Transformers
```python
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel
def mean_pooling(model_output, attention_mask):
    # Average token embeddings over the sequence, ignoring padding via the attention mask.
    token_embeddings = model_output[0]
    input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
    return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)
sentences = ['search_query: What is TSNE?', 'search_query: Who is Laurens van der Maaten?']
tokenizer = AutoTokenizer.from_pretrained('bert-base-uncased')
model = AutoModel.from_pretrained('nomic-ai/nomic-embed-text-v1', trust_remote_code=True)
model.eval()
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
with torch.no_grad():
model_output = model(**encoded_input)
embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
embeddings = F.normalize(embeddings, p=2, dim=1)
print(embeddings)
```
The model natively supports scaling the sequence length past 2048 tokens. To do so, apply the following changes:
```diff
- tokenizer = AutoTokenizer.from_pretrained('bert-base-uncased')
+ tokenizer = AutoTokenizer.from_pretrained('bert-base-uncased', model_max_length=8192)
- model = AutoModel.from_pretrained('nomic-ai/nomic-embed-text-v1', trust_remote_code=True)
+ model = AutoModel.from_pretrained('nomic-ai/nomic-embed-text-v1', trust_remote_code=True, rotary_scaling_factor=2)
```
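Putting the diff together, a hedged sketch of the full long-context setup; the repeated toy document and the inline mean pooling are illustrative, and `rotary_scaling_factor=2` is forwarded to the model's remote code as shown in the diff above:

```python
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

# Raise the tokenizer limit to 8192 and enable rotary position scaling.
tokenizer = AutoTokenizer.from_pretrained('bert-base-uncased', model_max_length=8192)
model = AutoModel.from_pretrained('nomic-ai/nomic-embed-text-v1', trust_remote_code=True, rotary_scaling_factor=2)
model.eval()

# An illustrative long input; real documents should use the same prefix scheme.
long_document = 'search_document: ' + 'A very long passage of text. ' * 500
encoded_input = tokenizer(long_document, padding=True, truncation=True, return_tensors='pt')
with torch.no_grad():
    model_output = model(**encoded_input)

# Mean-pool over non-padding tokens (same pooling as the example above), then normalize.
mask = encoded_input['attention_mask'].unsqueeze(-1).float()
embeddings = (model_output[0] * mask).sum(1) / mask.sum(1).clamp(min=1e-9)
embeddings = F.normalize(embeddings, p=2, dim=1)
print(embeddings.shape)
```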
### Transformers.js
```js
import { pipeline } from '@xenova/transformers';
// Create a feature extraction pipeline
const extractor = await pipeline('feature-extraction', 'nomic-ai/nomic-embed-text-v1', {
quantized: false, // Comment out this line to use the quantized version
});
// Compute sentence embeddings
const texts = ['What is TSNE?', 'Who is Laurens van der Maaten?'];
const embeddings = await extractor(texts, { pooling: 'mean', normalize: true });
console.log(embeddings);
```
# Join the Nomic Community
- Nomic: [https://nomic.ai](https://nomic.ai)
- Discord: [https://discord.gg/myY5YDR8z8](https://discord.gg/myY5YDR8z8)
- Twitter: [https://twitter.com/nomic_ai](https://twitter.com/nomic_ai)
# Citation
If you find the model, dataset, or training code useful, please cite our work:
```bibtex
@misc{nussbaum2024nomic,
title={Nomic Embed: Training a Reproducible Long Context Text Embedder},
author={Zach Nussbaum and John X. Morris and Brandon Duderstadt and Andriy Mulyar},
year={2024},
eprint={2402.01613},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
{"type": "precision_at_10", "value": 13.261000000000001}, {"type": "precision_at_100", "value": 1.5150000000000001}, {"type": "precision_at_1000", "value": 0.156}, {"type": "precision_at_3", "value": 37.01}, {"type": "precision_at_5", "value": 24.298000000000002}, {"type": "recall_at_1", "value": 70.256}, {"type": "recall_at_10", "value": 94.935}, {"type": "recall_at_100", "value": 99.274}, {"type": "recall_at_1000", "value": 99.928}, {"type": "recall_at_3", "value": 86.602}, {"type": "recall_at_5", "value": 91.133}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB RedditClustering", "type": "mteb/reddit-clustering", "config": "default", "split": "test", "revision": "24640382cdbf8abc73003fb0fa6d111a705499eb"}, "metrics": [{"type": "v_measure", "value": 56.322692497613104}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB RedditClusteringP2P", "type": "mteb/reddit-clustering-p2p", "config": "default", "split": "test", "revision": "282350215ef01743dc01b456c7f5241fa8937f16"}, "metrics": [{"type": "v_measure", "value": 61.895813503775074}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB SCIDOCS", "type": "scidocs", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "map_at_1", "value": 4.338}, {"type": "map_at_10", "value": 10.767}, {"type": "map_at_100", "value": 12.537999999999998}, {"type": "map_at_1000", "value": 12.803999999999998}, {"type": "map_at_3", "value": 7.788}, {"type": "map_at_5", "value": 9.302000000000001}, {"type": "mrr_at_1", "value": 21.4}, {"type": "mrr_at_10", "value": 31.637999999999998}, {"type": "mrr_at_100", "value": 32.688}, {"type": "mrr_at_1000", "value": 32.756}, {"type": "mrr_at_3", "value": 28.433000000000003}, {"type": "mrr_at_5", "value": 30.178}, {"type": "ndcg_at_1", "value": 21.4}, {"type": "ndcg_at_10", "value": 18.293}, {"type": "ndcg_at_100", "value": 25.274}, {"type": "ndcg_at_1000", "value": 30.284}, {"type": "ndcg_at_3", "value": 17.391000000000002}, {"type": "ndcg_at_5", "value": 15.146999999999998}, {"type": "precision_at_1", "value": 21.4}, {"type": "precision_at_10", "value": 9.48}, {"type": "precision_at_100", "value": 1.949}, {"type": "precision_at_1000", "value": 0.316}, {"type": "precision_at_3", "value": 16.167}, {"type": "precision_at_5", "value": 13.22}, {"type": "recall_at_1", "value": 4.338}, {"type": "recall_at_10", "value": 19.213}, {"type": "recall_at_100", "value": 39.562999999999995}, {"type": "recall_at_1000", "value": 64.08}, {"type": "recall_at_3", "value": 9.828000000000001}, {"type": "recall_at_5", "value": 13.383000000000001}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB SICK-R", "type": "mteb/sickr-sts", "config": "default", "split": "test", "revision": "a6ea5a8cab320b040a23452cc28066d9beae2cee"}, "metrics": [{"type": "cos_sim_pearson", "value": 82.42568163642142}, {"type": "cos_sim_spearman", "value": 78.5797159641342}, {"type": "euclidean_pearson", "value": 80.22151260811604}, {"type": "euclidean_spearman", "value": 78.5797151953878}, {"type": "manhattan_pearson", "value": 80.21224215864788}, {"type": "manhattan_spearman", "value": 78.55641478381344}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS12", "type": "mteb/sts12-sts", "config": "default", "split": "test", "revision": "a0d554a64d88156834ff5ae9920b964011b16384"}, "metrics": [{"type": "cos_sim_pearson", "value": 85.44020710812569}, {"type": "cos_sim_spearman", "value": 78.91631735081286}, {"type": "euclidean_pearson", "value": 81.64188964182102}, {"type": "euclidean_spearman", "value": 
78.91633286881678}, {"type": "manhattan_pearson", "value": 81.69294748512496}, {"type": "manhattan_spearman", "value": 78.93438558002656}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS13", "type": "mteb/sts13-sts", "config": "default", "split": "test", "revision": "7e90230a92c190f1bf69ae9002b8cea547a64cca"}, "metrics": [{"type": "cos_sim_pearson", "value": 84.27165426412311}, {"type": "cos_sim_spearman", "value": 85.40429140249618}, {"type": "euclidean_pearson", "value": 84.7509580724893}, {"type": "euclidean_spearman", "value": 85.40429140249618}, {"type": "manhattan_pearson", "value": 84.76488289321308}, {"type": "manhattan_spearman", "value": 85.4256793698708}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS14", "type": "mteb/sts14-sts", "config": "default", "split": "test", "revision": "6031580fec1f6af667f0bd2da0a551cf4f0b2375"}, "metrics": [{"type": "cos_sim_pearson", "value": 83.138851760732}, {"type": "cos_sim_spearman", "value": 81.64101363896586}, {"type": "euclidean_pearson", "value": 82.55165038934942}, {"type": "euclidean_spearman", "value": 81.64105257080502}, {"type": "manhattan_pearson", "value": 82.52802949883335}, {"type": "manhattan_spearman", "value": 81.61255430718158}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS15", "type": "mteb/sts15-sts", "config": "default", "split": "test", "revision": "ae752c7c21bf194d8b67fd573edf7ae58183cbe3"}, "metrics": [{"type": "cos_sim_pearson", "value": 86.0654695484029}, {"type": "cos_sim_spearman", "value": 87.20408521902229}, {"type": "euclidean_pearson", "value": 86.8110651362115}, {"type": "euclidean_spearman", "value": 87.20408521902229}, {"type": "manhattan_pearson", "value": 86.77984656478691}, {"type": "manhattan_spearman", "value": 87.1719947099227}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS16", "type": "mteb/sts16-sts", "config": "default", "split": "test", "revision": "4d8694f8f0e0100860b497b999b3dbed754a0513"}, "metrics": [{"type": "cos_sim_pearson", "value": 83.77823915496512}, {"type": "cos_sim_spearman", "value": 85.43566325729779}, {"type": "euclidean_pearson", "value": 84.5396956658821}, {"type": "euclidean_spearman", "value": 85.43566325729779}, {"type": "manhattan_pearson", "value": 84.5665398848169}, {"type": "manhattan_spearman", "value": 85.44375870303232}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS17 (en-en)", "type": "mteb/sts17-crosslingual-sts", "config": "en-en", "split": "test", "revision": "af5e6fb845001ecf41f4c1e033ce921939a2a68d"}, "metrics": [{"type": "cos_sim_pearson", "value": 87.20030208471798}, {"type": "cos_sim_spearman", "value": 87.20485505076539}, {"type": "euclidean_pearson", "value": 88.10588324368722}, {"type": "euclidean_spearman", "value": 87.20485505076539}, {"type": "manhattan_pearson", "value": 87.92324770415183}, {"type": "manhattan_spearman", "value": 87.0571314561877}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS22 (en)", "type": "mteb/sts22-crosslingual-sts", "config": "en", "split": "test", "revision": "6d1ba47164174a496b7fa5d3569dae26a6813b80"}, "metrics": [{"type": "cos_sim_pearson", "value": 63.06093161604453}, {"type": "cos_sim_spearman", "value": 64.2163140357722}, {"type": "euclidean_pearson", "value": 65.27589680994006}, {"type": "euclidean_spearman", "value": 64.2163140357722}, {"type": "manhattan_pearson", "value": 65.45904383711101}, {"type": "manhattan_spearman", "value": 64.55404716679305}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STSBenchmark", "type": "mteb/stsbenchmark-sts", 
"config": "default", "split": "test", "revision": "b0fddb56ed78048fa8b90373c8a3cfc37b684831"}, "metrics": [{"type": "cos_sim_pearson", "value": 84.32976164578706}, {"type": "cos_sim_spearman", "value": 85.54302197678368}, {"type": "euclidean_pearson", "value": 85.26307149193056}, {"type": "euclidean_spearman", "value": 85.54302197678368}, {"type": "manhattan_pearson", "value": 85.26647282029371}, {"type": "manhattan_spearman", "value": 85.5316135265568}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB SciDocsRR", "type": "mteb/scidocs-reranking", "config": "default", "split": "test", "revision": "d3c5e1fc0b855ab6097bf1cda04dd73947d7caab"}, "metrics": [{"type": "map", "value": 81.44675968318754}, {"type": "mrr", "value": 94.92741826075158}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB SciFact", "type": "scifact", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "map_at_1", "value": 56.34400000000001}, {"type": "map_at_10", "value": 65.927}, {"type": "map_at_100", "value": 66.431}, {"type": "map_at_1000", "value": 66.461}, {"type": "map_at_3", "value": 63.529}, {"type": "map_at_5", "value": 64.818}, {"type": "mrr_at_1", "value": 59.333000000000006}, {"type": "mrr_at_10", "value": 67.54599999999999}, {"type": "mrr_at_100", "value": 67.892}, {"type": "mrr_at_1000", "value": 67.917}, {"type": "mrr_at_3", "value": 65.778}, {"type": "mrr_at_5", "value": 66.794}, {"type": "ndcg_at_1", "value": 59.333000000000006}, {"type": "ndcg_at_10", "value": 70.5}, {"type": "ndcg_at_100", "value": 72.688}, {"type": "ndcg_at_1000", "value": 73.483}, {"type": "ndcg_at_3", "value": 66.338}, {"type": "ndcg_at_5", "value": 68.265}, {"type": "precision_at_1", "value": 59.333000000000006}, {"type": "precision_at_10", "value": 9.3}, {"type": "precision_at_100", "value": 1.053}, {"type": "precision_at_1000", "value": 0.11199999999999999}, {"type": "precision_at_3", "value": 25.889}, {"type": "precision_at_5", "value": 16.866999999999997}, {"type": "recall_at_1", "value": 56.34400000000001}, {"type": "recall_at_10", "value": 82.789}, {"type": "recall_at_100", "value": 92.767}, {"type": "recall_at_1000", "value": 99}, {"type": "recall_at_3", "value": 71.64399999999999}, {"type": "recall_at_5", "value": 76.322}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB SprintDuplicateQuestions", "type": "mteb/sprintduplicatequestions-pairclassification", "config": "default", "split": "test", "revision": "d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46"}, "metrics": [{"type": "cos_sim_accuracy", "value": 99.75742574257426}, {"type": "cos_sim_ap", "value": 93.52081548447406}, {"type": "cos_sim_f1", "value": 87.33850129198966}, {"type": "cos_sim_precision", "value": 90.37433155080214}, {"type": "cos_sim_recall", "value": 84.5}, {"type": "dot_accuracy", "value": 99.75742574257426}, {"type": "dot_ap", "value": 93.52081548447406}, {"type": "dot_f1", "value": 87.33850129198966}, {"type": "dot_precision", "value": 90.37433155080214}, {"type": "dot_recall", "value": 84.5}, {"type": "euclidean_accuracy", "value": 99.75742574257426}, {"type": "euclidean_ap", "value": 93.52081548447406}, {"type": "euclidean_f1", "value": 87.33850129198966}, {"type": "euclidean_precision", "value": 90.37433155080214}, {"type": "euclidean_recall", "value": 84.5}, {"type": "manhattan_accuracy", "value": 99.75841584158415}, {"type": "manhattan_ap", "value": 93.4975678585854}, {"type": "manhattan_f1", "value": 87.26708074534162}, {"type": "manhattan_precision", "value": 90.45064377682404}, 
{"type": "manhattan_recall", "value": 84.3}, {"type": "max_accuracy", "value": 99.75841584158415}, {"type": "max_ap", "value": 93.52081548447406}, {"type": "max_f1", "value": 87.33850129198966}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB StackExchangeClustering", "type": "mteb/stackexchange-clustering", "config": "default", "split": "test", "revision": "6cbc1f7b2bc0622f2e39d2c77fa502909748c259"}, "metrics": [{"type": "v_measure", "value": 64.31437036686651}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB StackExchangeClusteringP2P", "type": "mteb/stackexchange-clustering-p2p", "config": "default", "split": "test", "revision": "815ca46b2622cec33ccafc3735d572c266efdb44"}, "metrics": [{"type": "v_measure", "value": 33.25569319007206}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB StackOverflowDupQuestions", "type": "mteb/stackoverflowdupquestions-reranking", "config": "default", "split": "test", "revision": "e185fbe320c72810689fc5848eb6114e1ef5ec69"}, "metrics": [{"type": "map", "value": 49.90474939720706}, {"type": "mrr", "value": 50.568115503777264}]}, {"task": {"type": "Summarization"}, "dataset": {"name": "MTEB SummEval", "type": "mteb/summeval", "config": "default", "split": "test", "revision": "cda12ad7615edc362dbf25a00fdd61d3b1eaf93c"}, "metrics": [{"type": "cos_sim_pearson", "value": 29.866828641244712}, {"type": "cos_sim_spearman", "value": 30.077555055873866}, {"type": "dot_pearson", "value": 29.866832988572266}, {"type": "dot_spearman", "value": 30.077555055873866}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB TRECCOVID", "type": "trec-covid", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "map_at_1", "value": 0.232}, {"type": "map_at_10", "value": 2.094}, {"type": "map_at_100", "value": 11.971}, {"type": "map_at_1000", "value": 28.158}, {"type": "map_at_3", "value": 0.688}, {"type": "map_at_5", "value": 1.114}, {"type": "mrr_at_1", "value": 88}, {"type": "mrr_at_10", "value": 93.4}, {"type": "mrr_at_100", "value": 93.4}, {"type": "mrr_at_1000", "value": 93.4}, {"type": "mrr_at_3", "value": 93}, {"type": "mrr_at_5", "value": 93.4}, {"type": "ndcg_at_1", "value": 84}, {"type": "ndcg_at_10", "value": 79.923}, {"type": "ndcg_at_100", "value": 61.17}, {"type": "ndcg_at_1000", "value": 53.03}, {"type": "ndcg_at_3", "value": 84.592}, {"type": "ndcg_at_5", "value": 82.821}, {"type": "precision_at_1", "value": 88}, {"type": "precision_at_10", "value": 85}, {"type": "precision_at_100", "value": 63.019999999999996}, {"type": "precision_at_1000", "value": 23.554}, {"type": "precision_at_3", "value": 89.333}, {"type": "precision_at_5", "value": 87.2}, {"type": "recall_at_1", "value": 0.232}, {"type": "recall_at_10", "value": 2.255}, {"type": "recall_at_100", "value": 14.823}, {"type": "recall_at_1000", "value": 49.456}, {"type": "recall_at_3", "value": 0.718}, {"type": "recall_at_5", "value": 1.175}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB Touche2020", "type": "webis-touche2020", "config": "default", "split": "test", "revision": "None"}, "metrics": [{"type": "map_at_1", "value": 2.547}, {"type": "map_at_10", "value": 11.375}, {"type": "map_at_100", "value": 18.194}, {"type": "map_at_1000", "value": 19.749}, {"type": "map_at_3", "value": 5.825}, {"type": "map_at_5", "value": 8.581}, {"type": "mrr_at_1", "value": 32.653}, {"type": "mrr_at_10", "value": 51.32}, {"type": "mrr_at_100", "value": 51.747}, {"type": "mrr_at_1000", "value": 51.747}, {"type": "mrr_at_3", "value": 
47.278999999999996}, {"type": "mrr_at_5", "value": 48.605}, {"type": "ndcg_at_1", "value": 29.592000000000002}, {"type": "ndcg_at_10", "value": 28.151}, {"type": "ndcg_at_100", "value": 39.438}, {"type": "ndcg_at_1000", "value": 50.769}, {"type": "ndcg_at_3", "value": 30.758999999999997}, {"type": "ndcg_at_5", "value": 30.366}, {"type": "precision_at_1", "value": 32.653}, {"type": "precision_at_10", "value": 25.714}, {"type": "precision_at_100", "value": 8.041}, {"type": "precision_at_1000", "value": 1.555}, {"type": "precision_at_3", "value": 33.333}, {"type": "precision_at_5", "value": 31.837}, {"type": "recall_at_1", "value": 2.547}, {"type": "recall_at_10", "value": 18.19}, {"type": "recall_at_100", "value": 49.538}, {"type": "recall_at_1000", "value": 83.86}, {"type": "recall_at_3", "value": 7.329}, {"type": "recall_at_5", "value": 11.532}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB ToxicConversationsClassification", "type": "mteb/toxic_conversations_50k", "config": "default", "split": "test", "revision": "d7c0de2777da35d6aae2200a62c6e0e5af397c4c"}, "metrics": [{"type": "accuracy", "value": 71.4952}, {"type": "ap", "value": 14.793362635531409}, {"type": "f1", "value": 55.204635551516915}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB TweetSentimentExtractionClassification", "type": "mteb/tweet_sentiment_extraction", "config": "default", "split": "test", "revision": "d604517c81ca91fe16a244d1248fc021f9ecee7a"}, "metrics": [{"type": "accuracy", "value": 61.5365025466893}, {"type": "f1", "value": 61.81742556334845}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB TwentyNewsgroupsClustering", "type": "mteb/twentynewsgroups-clustering", "config": "default", "split": "test", "revision": "6125ec4e24fa026cec8a478383ee943acfbd5449"}, "metrics": [{"type": "v_measure", "value": 49.05531070301185}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB TwitterSemEval2015", "type": "mteb/twittersemeval2015-pairclassification", "config": "default", "split": "test", "revision": "70970daeab8776df92f5ea462b6173c0b46fd2d1"}, "metrics": [{"type": "cos_sim_accuracy", "value": 86.51725576682364}, {"type": "cos_sim_ap", "value": 75.2292304265163}, {"type": "cos_sim_f1", "value": 69.54022988505749}, {"type": "cos_sim_precision", "value": 63.65629110039457}, {"type": "cos_sim_recall", "value": 76.62269129287598}, {"type": "dot_accuracy", "value": 86.51725576682364}, {"type": "dot_ap", "value": 75.22922386081054}, {"type": "dot_f1", "value": 69.54022988505749}, {"type": "dot_precision", "value": 63.65629110039457}, {"type": "dot_recall", "value": 76.62269129287598}, {"type": "euclidean_accuracy", "value": 86.51725576682364}, {"type": "euclidean_ap", "value": 75.22925730473472}, {"type": "euclidean_f1", "value": 69.54022988505749}, {"type": "euclidean_precision", "value": 63.65629110039457}, {"type": "euclidean_recall", "value": 76.62269129287598}, {"type": "manhattan_accuracy", "value": 86.52321630804077}, {"type": "manhattan_ap", "value": 75.20608115037336}, {"type": "manhattan_f1", "value": 69.60000000000001}, {"type": "manhattan_precision", "value": 64.37219730941705}, {"type": "manhattan_recall", "value": 75.75197889182058}, {"type": "max_accuracy", "value": 86.52321630804077}, {"type": "max_ap", "value": 75.22925730473472}, {"type": "max_f1", "value": 69.60000000000001}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB TwitterURLCorpus", "type": "mteb/twitterurlcorpus-pairclassification", "config": "default", "split": 
"test", "revision": "8b6510b0b1fa4e4c4f879467980e9be563ec1cdf"}, "metrics": [{"type": "cos_sim_accuracy", "value": 89.34877944657896}, {"type": "cos_sim_ap", "value": 86.71257569277373}, {"type": "cos_sim_f1", "value": 79.10386355986088}, {"type": "cos_sim_precision", "value": 76.91468470434214}, {"type": "cos_sim_recall", "value": 81.4213119802895}, {"type": "dot_accuracy", "value": 89.34877944657896}, {"type": "dot_ap", "value": 86.71257133133368}, {"type": "dot_f1", "value": 79.10386355986088}, {"type": "dot_precision", "value": 76.91468470434214}, {"type": "dot_recall", "value": 81.4213119802895}, {"type": "euclidean_accuracy", "value": 89.34877944657896}, {"type": "euclidean_ap", "value": 86.71257651501476}, {"type": "euclidean_f1", "value": 79.10386355986088}, {"type": "euclidean_precision", "value": 76.91468470434214}, {"type": "euclidean_recall", "value": 81.4213119802895}, {"type": "manhattan_accuracy", "value": 89.35848177901967}, {"type": "manhattan_ap", "value": 86.69330615469126}, {"type": "manhattan_f1", "value": 79.13867741453949}, {"type": "manhattan_precision", "value": 76.78881807647741}, {"type": "manhattan_recall", "value": 81.63689559593472}, {"type": "max_accuracy", "value": 89.35848177901967}, {"type": "max_ap", "value": 86.71257651501476}, {"type": "max_f1", "value": 79.13867741453949}]}]}]} |
RichardErkhov/EleutherAI_-_pythia-160m-v0-gguf | RichardErkhov | null | [
"gguf",
"arxiv:2101.00027",
"arxiv:2201.07311",
"endpoints_compatible",
"region:us"
]
| 2024-11-07T01:19:19 | 2024-11-07T01:30:18 | 103 | 0 | ---
{}
---
Quantization made by Richard Erkhov.
[Github](https://github.com/RichardErkhov)
[Discord](https://discord.gg/pvy7H8DZMG)
[Request more models](https://github.com/RichardErkhov/quant_request)
pythia-160m-v0 - GGUF
- Model creator: https://huggingface.co/EleutherAI/
- Original model: https://huggingface.co/EleutherAI/pythia-160m-v0/
| Name | Quant method | Size |
| ---- | ---- | ---- |
| [pythia-160m-v0.Q2_K.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-160m-v0-gguf/blob/main/pythia-160m-v0.Q2_K.gguf) | Q2_K | 0.07GB |
| [pythia-160m-v0.Q3_K_S.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-160m-v0-gguf/blob/main/pythia-160m-v0.Q3_K_S.gguf) | Q3_K_S | 0.08GB |
| [pythia-160m-v0.Q3_K.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-160m-v0-gguf/blob/main/pythia-160m-v0.Q3_K.gguf) | Q3_K | 0.09GB |
| [pythia-160m-v0.Q3_K_M.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-160m-v0-gguf/blob/main/pythia-160m-v0.Q3_K_M.gguf) | Q3_K_M | 0.09GB |
| [pythia-160m-v0.Q3_K_L.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-160m-v0-gguf/blob/main/pythia-160m-v0.Q3_K_L.gguf) | Q3_K_L | 0.09GB |
| [pythia-160m-v0.IQ4_XS.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-160m-v0-gguf/blob/main/pythia-160m-v0.IQ4_XS.gguf) | IQ4_XS | 0.09GB |
| [pythia-160m-v0.Q4_0.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-160m-v0-gguf/blob/main/pythia-160m-v0.Q4_0.gguf) | Q4_0 | 0.1GB |
| [pythia-160m-v0.IQ4_NL.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-160m-v0-gguf/blob/main/pythia-160m-v0.IQ4_NL.gguf) | IQ4_NL | 0.1GB |
| [pythia-160m-v0.Q4_K_S.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-160m-v0-gguf/blob/main/pythia-160m-v0.Q4_K_S.gguf) | Q4_K_S | 0.1GB |
| [pythia-160m-v0.Q4_K.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-160m-v0-gguf/blob/main/pythia-160m-v0.Q4_K.gguf) | Q4_K | 0.1GB |
| [pythia-160m-v0.Q4_K_M.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-160m-v0-gguf/blob/main/pythia-160m-v0.Q4_K_M.gguf) | Q4_K_M | 0.1GB |
| [pythia-160m-v0.Q4_1.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-160m-v0-gguf/blob/main/pythia-160m-v0.Q4_1.gguf) | Q4_1 | 0.1GB |
| [pythia-160m-v0.Q5_0.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-160m-v0-gguf/blob/main/pythia-160m-v0.Q5_0.gguf) | Q5_0 | 0.11GB |
| [pythia-160m-v0.Q5_K_S.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-160m-v0-gguf/blob/main/pythia-160m-v0.Q5_K_S.gguf) | Q5_K_S | 0.11GB |
| [pythia-160m-v0.Q5_K.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-160m-v0-gguf/blob/main/pythia-160m-v0.Q5_K.gguf) | Q5_K | 0.12GB |
| [pythia-160m-v0.Q5_K_M.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-160m-v0-gguf/blob/main/pythia-160m-v0.Q5_K_M.gguf) | Q5_K_M | 0.12GB |
| [pythia-160m-v0.Q5_1.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-160m-v0-gguf/blob/main/pythia-160m-v0.Q5_1.gguf) | Q5_1 | 0.12GB |
| [pythia-160m-v0.Q6_K.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-160m-v0-gguf/blob/main/pythia-160m-v0.Q6_K.gguf) | Q6_K | 0.13GB |
| [pythia-160m-v0.Q8_0.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-160m-v0-gguf/blob/main/pythia-160m-v0.Q8_0.gguf) | Q8_0 | 0.16GB |
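These files can be run with any GGUF-compatible runtime. As a minimal, hedged sketch (llama-cpp-python is one common option rather than anything this repo prescribes; the filename comes from the table above, everything else is an assumption):

```python
# pip install llama-cpp-python huggingface_hub
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# download a single quantized file from this repo (Q4_K_M chosen as an example)
path = hf_hub_download(
    repo_id="RichardErkhov/EleutherAI_-_pythia-160m-v0-gguf",
    filename="pythia-160m-v0.Q4_K_M.gguf",
)

llm = Llama(model_path=path)             # loads the GGUF weights, CPU by default
out = llm("Hello, I am", max_tokens=32)  # plain text completion
print(out["choices"][0]["text"])
```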
Original model description:
---
language:
- en
tags:
- pytorch
- causal-lm
- pythia
- pythia_v0
license: apache-2.0
datasets:
- the_pile
---
The *Pythia Scaling Suite* is a collection of models developed to facilitate
interpretability research. It contains two sets of eight models of sizes
70M, 160M, 410M, 1B, 1.4B, 2.8B, 6.9B, and 12B. For each size, there are two
models: one trained on the Pile, and one trained on the Pile after the dataset
has been globally deduplicated. All 8 model sizes are trained on the exact
same data, in the exact same order. All Pythia models are available
[on Hugging Face](https://huggingface.co/models?other=pythia).
The Pythia model suite was deliberately designed to promote scientific
research on large language models, especially interpretability research.
Despite not centering downstream performance as a design goal, we find the
models <a href="#evaluations">match or exceed</a> the performance of
similar and same-sized models, such as those in the OPT and GPT-Neo suites.
Please note that all models in the *Pythia* suite were renamed in January
2023. For clarity, a <a href="#naming-convention-and-parameter-count">table
comparing the old and new names</a> is provided in this model card, together
with exact parameter counts.
## Pythia-160M
### Model Details
- Developed by: [EleutherAI](http://eleuther.ai)
- Model type: Transformer-based Language Model
- Language: English
- Learn more: [Pythia's GitHub repository](https://github.com/EleutherAI/pythia)
for training procedure, config files, and details on how to use it.
- Library: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox)
- License: Apache 2.0
- Contact: to ask questions about this model, join the [EleutherAI
Discord](https://discord.gg/zBGx3azzUn), and post them in `#release-discussion`.
Please read the existing *Pythia* documentation before asking about it in the
EleutherAI Discord. For general correspondence: [[email protected]](mailto:[email protected]).
<figure>
| Pythia model | Non-Embedding Params | Layers | Model Dim | Heads | Batch Size | Learning Rate | Equivalent Models |
| -----------: | -------------------: | :----: | :-------: | :---: | :--------: | :-------------------: | :--------------------: |
| 70M | 18,915,328 | 6 | 512 | 8 | 2M | 1.0 x 10<sup>-3</sup> | — |
| 160M | 85,056,000 | 12 | 768 | 12 | 4M | 6.0 x 10<sup>-4</sup> | GPT-Neo 125M, OPT-125M |
| 410M | 302,311,424 | 24 | 1024 | 16 | 4M | 3.0 x 10<sup>-4</sup> | OPT-350M |
| 1.0B | 805,736,448 | 16 | 2048 | 8 | 2M | 3.0 x 10<sup>-4</sup> | — |
| 1.4B | 1,208,602,624 | 24 | 2048 | 16 | 4M | 2.0 x 10<sup>-4</sup> | GPT-Neo 1.3B, OPT-1.3B |
| 2.8B | 2,517,652,480 | 32 | 2560 | 32 | 2M | 1.6 x 10<sup>-4</sup> | GPT-Neo 2.7B, OPT-2.7B |
| 6.9B | 6,444,163,072 | 32 | 4096 | 32 | 2M | 1.2 x 10<sup>-4</sup> | OPT-6.7B |
| 12B | 11,327,027,200 | 36 | 5120 | 40 | 2M | 1.2 x 10<sup>-4</sup> | — |
<figcaption>Engineering details for the <i>Pythia Suite</i>. Deduped and
non-deduped models of a given size have the same hyperparameters. “Equivalent”
models have <b>exactly</b> the same architecture, and the same number of
non-embedding parameters.</figcaption>
</figure>
### Uses and Limitations
#### Intended Use
The primary intended use of Pythia is research on the behavior, functionality,
and limitations of large language models. This suite is intended to provide
a controlled setting for performing scientific experiments. To enable the
study of how language models change over the course of training, we provide
143 evenly spaced intermediate checkpoints per model. These checkpoints are
hosted on Hugging Face as branches. Note that branch `step143000` corresponds
exactly to the model checkpoint on the `main` branch of each model.
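As a hedged side note (not from the original card), the checkpoint branches of a repository can be enumerated with `huggingface_hub`:

```python
# pip install huggingface_hub
from huggingface_hub import list_repo_refs

refs = list_repo_refs("EleutherAI/pythia-160m-v0")
steps = sorted(
    (branch.name for branch in refs.branches if branch.name.startswith("step")),
    key=lambda name: int(name[len("step"):]),
)
print(len(steps), steps[:3], steps[-1])  # the last step branch should be "step143000"
```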
You may also further fine-tune and adapt Pythia-160M for deployment,
as long as your use is in accordance with the Apache 2.0 license. Pythia
models work with the Hugging Face [Transformers
Library](https://huggingface.co/docs/transformers/index). If you decide to use
pre-trained Pythia-160M as a basis for your fine-tuned model, please
conduct your own risk and bias assessment.
#### Out-of-scope use
The Pythia Suite is **not** intended for deployment. It is not in itself
a product and cannot be used for human-facing interactions.
Pythia models are English-language only, and are not suitable for translation
or generating text in other languages.
Pythia-160M has not been fine-tuned for downstream contexts in which
language models are commonly deployed, such as writing genre prose
or commercial chatbots. This means Pythia-160M will **not**
respond to a given prompt the way a product like ChatGPT does. This is because,
unlike this model, ChatGPT was fine-tuned using methods such as Reinforcement
Learning from Human Feedback (RLHF) to better “understand” human instructions.
#### Limitations and biases
The core functionality of a large language model is to take a string of text
and predict the next token. The token deemed statistically most likely by the
model need not produce the most “accurate” text. Never rely on
Pythia-160M to produce factually accurate output.
This model was trained on [the Pile](https://pile.eleuther.ai/), a dataset
known to contain profanity and texts that are lewd or otherwise offensive.
See [Section 6 of the Pile paper](https://arxiv.org/abs/2101.00027) for a
discussion of documented biases with regard to gender, religion, and race.
Pythia-160M may produce socially unacceptable or undesirable text, *even if*
the prompt itself does not include anything explicitly offensive.
If you plan on using text generated through, for example, the Hosted Inference
API, we recommend having a human curate the outputs of this language model
before presenting them to other people. Please inform your audience that the
text was generated by Pythia-160M.
### Quickstart
Pythia models can be loaded and used via the following code, demonstrated here
for the third `pythia-70m-deduped` checkpoint:
```python
from transformers import GPTNeoXForCausalLM, AutoTokenizer

# "revision" selects a checkpoint branch; "step3000" is the checkpoint saved
# after 3,000 training steps
model = GPTNeoXForCausalLM.from_pretrained(
    "EleutherAI/pythia-70m-deduped",
    revision="step3000",
    cache_dir="./pythia-70m-deduped/step3000",
)

tokenizer = AutoTokenizer.from_pretrained(
    "EleutherAI/pythia-70m-deduped",
    revision="step3000",
    cache_dir="./pythia-70m-deduped/step3000",
)

# tokenize a prompt, generate a continuation, and decode it back to text
inputs = tokenizer("Hello, I am", return_tensors="pt")
tokens = model.generate(**inputs)
tokenizer.decode(tokens[0])
```
Revision/branch `step143000` corresponds exactly to the model checkpoint on
the `main` branch of each model.<br>
For more information on how to use all Pythia models, see [documentation on
GitHub](https://github.com/EleutherAI/pythia).
### Training
#### Training data
[The Pile](https://pile.eleuther.ai/) is an 825GiB general-purpose dataset in
English. It was created by EleutherAI specifically for training large language
models. It contains texts from 22 diverse sources, roughly broken down into
five categories: academic writing (e.g. arXiv), internet (e.g. CommonCrawl),
prose (e.g. Project Gutenberg), dialogue (e.g. YouTube subtitles), and
miscellaneous (e.g. GitHub, Enron Emails). See [the Pile
paper](https://arxiv.org/abs/2101.00027) for a breakdown of all data sources,
methodology, and a discussion of ethical implications. Consult [the
datasheet](https://arxiv.org/abs/2201.07311) for more detailed documentation
about the Pile and its component datasets. The Pile can be downloaded from
the [official website](https://pile.eleuther.ai/), or from a [community
mirror](https://the-eye.eu/public/AI/pile/).<br>
The Pile was **not** deduplicated before being used to train Pythia-160M.
#### Training procedure
All models were trained on the exact same data, in the exact same order. Each
model saw 299,892,736,000 tokens during training, and 143 checkpoints for each
model are saved every 2,097,152,000 tokens, spaced evenly throughout training.
This corresponds to training for just under 1 epoch on the Pile for
non-deduplicated models, and about 1.5 epochs on the deduplicated Pile.
All *Pythia* models trained for the equivalent of 143000 steps at a batch size
of 2,097,152 tokens. Two batch sizes were used: 2M and 4M. Models listed with a
batch size of 4M tokens were originally trained for 71500 steps instead, with
checkpoints every 500 steps. The checkpoints on Hugging Face are renamed for
consistency with all 2M batch models, so `step1000` is the first checkpoint
for `pythia-1.4b` that was saved (corresponding to step 500 in training), and
`step1000` is likewise the first `pythia-6.9b` checkpoint that was saved
(corresponding to 1000 “actual” steps).<br>
See [GitHub](https://github.com/EleutherAI/pythia) for more details on training
procedure, including [how to reproduce
it](https://github.com/EleutherAI/pythia/blob/main/README.md#reproducing-training).<br>
Pythia uses the same tokenizer as [GPT-NeoX-20B](https://huggingface.co/EleutherAI/gpt-neox-20b).
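To make the checkpoint renaming concrete, here is a small arithmetic sketch (the numbers come from this card; the helper functions are illustrative):

```python
TOKENS_PER_HF_STEP = 2_097_152  # the 2M-token batch that HF step numbers are scaled to

def tokens_seen(hf_step: int) -> int:
    # every renamed HF step corresponds to 2,097,152 tokens, regardless of batch size
    return hf_step * TOKENS_PER_HF_STEP

def original_step(hf_step: int, batch_tokens: int) -> int:
    # for 4M-batch models such as Pythia-160M, HF step N was originally step N/2
    return hf_step * TOKENS_PER_HF_STEP // batch_tokens

assert tokens_seen(143_000) == 299_892_736_000     # total tokens seen by every model
assert original_step(1_000, 4 * 1_048_576) == 500  # the pythia-1.4b example above
```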
### Evaluations
All 16 *Pythia* models were evaluated using the [LM Evaluation
Harness](https://github.com/EleutherAI/lm-evaluation-harness). You can access
the results by model and step at `results/json/*` in the [GitHub
repository](https://github.com/EleutherAI/pythia/tree/main/results/json).<br>
Expand the sections below to see plots of evaluation results for all
Pythia and Pythia-deduped models compared with OPT and BLOOM.
<details>
<summary>LAMBADA – OpenAI</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/lambada_openai.png" style="width:auto"/>
</details>
<details>
<summary>Physical Interaction: Question Answering (PIQA)</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/piqa.png" style="width:auto"/>
</details>
<details>
<summary>WinoGrande</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/winogrande.png" style="width:auto"/>
</details>
<details>
<summary>AI2 Reasoning Challenge—Challenge Set</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/arc_challenge.png" style="width:auto"/>
</details>
<details>
<summary>SciQ</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/sciq.png" style="width:auto"/>
</details>
### Naming convention and parameter count
*Pythia* models were renamed in January 2023. It is possible that the old
naming convention still persists in some documentation by accident. The
current naming convention (70M, 160M, etc.) is based on total parameter count.
<figure style="width:32em">
| current Pythia suffix | old suffix | total params | non-embedding params |
| --------------------: | ---------: | -------------: | -------------------: |
| 70M | 19M | 70,426,624 | 18,915,328 |
| 160M | 125M | 162,322,944 | 85,056,000 |
| 410M | 350M | 405,334,016 | 302,311,424 |
| 1B | 800M | 1,011,781,632 | 805,736,448 |
| 1.4B | 1.3B | 1,414,647,808 | 1,208,602,624 |
| 2.8B | 2.7B | 2,775,208,960 | 2,517,652,480 |
| 6.9B | 6.7B | 6,857,302,016 | 6,444,163,072 |
| 12B | 13B | 11,846,072,320 | 11,327,027,200 |
</figure>
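As a sanity check on this table (my arithmetic, not part of the original card), the gap between total and non-embedding parameters equals two untied embedding matrices of size vocabulary × model dimension, with the vocabulary padded to 50,304 in the embedding layer:

```python
PADDED_VOCAB = 50_304  # tokenizer vocabulary as padded in the embedding matrix

def embedding_params(d_model: int) -> int:
    # untied input embedding plus output (unembedding) matrix
    return 2 * PADDED_VOCAB * d_model

assert 18_915_328 + embedding_params(512) == 70_426_624     # 70M row
assert 85_056_000 + embedding_params(768) == 162_322_944    # 160M row
assert 302_311_424 + embedding_params(1024) == 405_334_016  # 410M row
```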
| [
"QUESTION_ANSWERING",
"TRANSLATION"
]
| [
"SCIQ"
]
| Non_BioNLP |
| {} |
medspaner/flair-clinical-trials-temp-ents | medspaner | null | [
"license:cc-by-nc-4.0",
"region:us"
]
| 2023-09-28T17:54:55 | 2024-10-01T06:35:34 | 0 | 0 | ---
license: cc-by-nc-4.0
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: flair-clinical-trials-temp-ents
results: []
---
# flair-clinical-trials-temp-ents
This named entity recognition model detects temporal expressions (TIMEX) according to the [TimeML scheme](https://en.wikipedia.org/wiki/ISO-TimeML) ([Pustejovsky et al. 2005](http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.85.5610&rep=rep1&type=pdf)), in addition to Age entities:
- Age: e.g. *18 años*
- Date: e.g. *2022*, *26 de noviembre*
- Duration: e.g. *3 horas*
- Frequency: e.g. *semanal*
- Time: e.g. *noche*
The model achieves the following results on the test set (results are averaged over 5 evaluation rounds):
- Precision: 0.899 (±0.007)
- Recall: 0.859 (±0.005)
- F1: 0.879 (±0.006)
- Accuracy: 0.808 (±0.007)
## Model description
This model is fine-tuned to conduct medical named entity recognition on Spanish texts about clinical trials using the [CT-EBM-ES corpus (Campillos-Llanos et al. 2021)](https://bmcmedinformdecismak.biomedcentral.com/articles/10.1186/s12911-021-01395-z).
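A minimal usage sketch (assuming the model loads through flair's standard `SequenceTagger` API; the example sentence is illustrative):

```python
# pip install flair
from flair.data import Sentence
from flair.models import SequenceTagger

tagger = SequenceTagger.load("medspaner/flair-clinical-trials-temp-ents")

# Spanish clinical-trial text with an Age, a Duration and a Frequency expression
sentence = Sentence("Pacientes mayores de 18 años, tratados durante 3 horas con frecuencia semanal.")
tagger.predict(sentence)

for span in sentence.get_spans():
    print(span)  # each span prints its text together with the predicted label
```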
If you use this model, please cite it as follows:
```
@article{campillosetal2024,
title = {{Hybrid tool for semantic annotation and concept extraction of medical texts in Spanish}},
author = {Campillos-Llanos, Leonardo and Valverde-Mateos, Ana and Capllonch-Carri{\'o}n, Adri{\'a}n},
journal = {BMC Bioinformatics},
year={2024},
publisher={BioMed Central}
}
```
## Intended uses & limitations
**Disclosure**: *This model is under development and needs to be improved. It should not be used for medical decision making without human assistance and supervision.*
This model is intended for general-purpose use and may exhibit bias and/or other undesirable distortions.
Third parties who deploy or provide systems and/or services using any of these models (or using systems based on these models) should note that it is their responsibility to mitigate the risks arising from their use. Third parties, in any event, need to comply with applicable regulations, including regulations concerning the use of artificial intelligence.
The owner or creator of the models will in no event be liable for any results arising from the use made by third parties of these models.
**Descargo de responsabilidad**: *Esta herramienta se encuentra en desarrollo y no debe ser empleada para la toma de decisiones médicas.*
La finalidad de este modelo es generalista, y se advierte que puede tener sesgos y/u otro tipo de distorsiones indeseables.
Terceras partes que desplieguen o proporcionen sistemas y/o servicios usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) han de tener presente que es su responsabilidad abordar y minimizar los riesgos derivados de su uso. Las terceras partes, en cualquier circunstancia, deben cumplir con la normativa aplicable, incluyendo la normativa que concierne al uso de la inteligencia artificial.
El propietario o creador de los modelos de ningún modo será responsable de los resultados derivados del uso que las terceras partes hagan de estos modelos.
## Training and evaluation data
The data used for fine-tuning come from the [Clinical Trials for Evidence-Based Medicine in Spanish corpus](http://www.lllf.uam.es/ESP/nlpdata/wp2/).
It is a collection of 1,200 texts comprising clinical trial studies and clinical trial announcements:
- 500 abstracts from journals published under a Creative Commons license (e.g., available in PubMed or the Scientific Electronic Library Online, SciELO)
- 700 clinical trial announcements published in the European Clinical Trials Register and the Repositorio Español de Estudios Clínicos
If you use the CT-EBM-ES resource, please cite as follows:
```
@article{campillosetal-midm2021,
title = {A clinical trials corpus annotated with UMLS© entities to enhance the access to Evidence-Based Medicine},
author = {Campillos-Llanos, Leonardo and Valverde-Mateos, Ana and Capllonch-Carri{\'o}n, Adri{\'a}n and Moreno-Sandoval, Antonio},
journal = {BMC Medical Informatics and Decision Making},
volume={21},
number={1},
pages={1--19},
year={2021},
publisher={BioMed Central}
}
```
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a training sketch in Flair follows the list):
- learning_rate: 0.1
- train_batch_size: 16
- seed: we used a different random initialization for each of the 5 evaluation rounds and uploaded the model with the best results
- num_epochs: 68.2 on average (±7.29), with early stopping after 5 epochs without improvement (early stopping patience: 5)
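A rough sketch of this training setup in Flair 0.12; the corpus files, column format, and embedding model are illustrative assumptions, since the card does not specify them:
```python
from flair.datasets import ColumnCorpus
from flair.embeddings import TransformerWordEmbeddings
from flair.models import SequenceTagger
from flair.trainers import ModelTrainer

# Hypothetical CoNLL-style column files prepared from the CT-EBM-ES annotations.
corpus = ColumnCorpus(
    "data/ct-ebm-es",
    {0: "text", 1: "ner"},
    train_file="train.txt", dev_file="dev.txt", test_file="test.txt",
)
label_dict = corpus.make_label_dictionary(label_type="ner")

# The card does not state which embeddings were used; a multilingual
# transformer encoder is one plausible stand-in.
embeddings = TransformerWordEmbeddings("bert-base-multilingual-cased")

tagger = SequenceTagger(
    hidden_size=256,
    embeddings=embeddings,
    tag_dictionary=label_dict,
    tag_type="ner",
)

trainer = ModelTrainer(tagger, corpus)
trainer.train(
    "models/flair-clinical-trials-temp-ents",
    learning_rate=0.1,   # as reported above
    mini_batch_size=16,  # train_batch_size above
    patience=5,          # early stopping patience above
    max_epochs=150,      # upper bound; training averaged ~68 epochs
)
```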
### Training results (test set; average and standard deviation of 5 rounds)
| Precision | Recall | F1 | Accuracy |
|:--------------:|:--------------:|:--------------:|:--------------:|
| 0.899 (±0.007) | 0.859 (±0.005) | 0.879 (±0.006) | 0.808 (±0.007) |
### Framework versions
- FLAIR 0.12
- Pytorch 1.10.2+cu116
| [
"NAMED_ENTITY_RECOGNITION"
]
| [
"SCIELO"
]
| BioNLP |
| {"license": "cc-by-nc-4.0", "metrics": ["precision", "recall", "f1", "accuracy"], "model-index": [{"name": "flair-clinical-trials-temp-ents", "results": []}]} |